26264 1727204235.60424: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-G1p
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
26264 1727204235.61142: Added group all to inventory
26264 1727204235.61144: Added group ungrouped to inventory
26264 1727204235.61148: Group all now contains ungrouped
26264 1727204235.61152: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml
26264 1727204235.84629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
26264 1727204235.84806: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
26264 1727204235.84832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
26264 1727204235.85029: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
26264 1727204235.85261: Loaded config def from plugin (inventory/script)
26264 1727204235.85303: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
26264 1727204235.85344: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
26264 1727204235.85641: Loaded config def from plugin (inventory/yaml)
26264 1727204235.85647: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
26264 1727204235.85826: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
26264 1727204235.86331: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
26264 1727204235.86334: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
26264 1727204235.86337: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
26264 1727204235.86344: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
26264 1727204235.86348: Loading data from /tmp/network-M6W/inventory-5vW.yml
26264 1727204235.86431: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto
26264 1727204235.86501: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
26264 1727204235.86552: Loading data from /tmp/network-M6W/inventory-5vW.yml
26264 1727204235.86641: group all already in inventory
26264 1727204235.86648: set inventory_file for managed-node1
26264 1727204235.86653: set inventory_dir for managed-node1
26264 1727204235.86654: Added host managed-node1 to inventory
26264 1727204235.86656: Added host managed-node1 to group all
26264 1727204235.86657: set ansible_host for managed-node1
26264 1727204235.86658: set ansible_ssh_extra_args for managed-node1
26264 1727204235.86661: set inventory_file for managed-node2
26264 1727204235.86666: set inventory_dir for managed-node2
26264 1727204235.86667: Added host managed-node2 to inventory
26264 1727204235.86668: Added host managed-node2 to group all
26264 1727204235.86669: set ansible_host for managed-node2
26264 1727204235.86670: set ansible_ssh_extra_args for managed-node2
26264 1727204235.86673: set inventory_file for managed-node3
26264 1727204235.86676: set inventory_dir for managed-node3
26264 1727204235.86676: Added host managed-node3 to inventory
26264 1727204235.86678: Added host managed-node3 to group all
26264 1727204235.86678: set ansible_host for managed-node3
26264 1727204235.86679: set ansible_ssh_extra_args for managed-node3
26264 1727204235.86682: Reconcile groups and hosts in inventory.
26264 1727204235.86686: Group ungrouped now contains managed-node1
26264 1727204235.86688: Group ungrouped now contains managed-node2
26264 1727204235.86689: Group ungrouped now contains managed-node3
26264 1727204235.86807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
26264 1727204235.86938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
26264 1727204235.86994: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
26264 1727204235.87022: Loaded config def from plugin (vars/host_group_vars)
26264 1727204235.87025: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
26264 1727204235.87031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
26264 1727204235.87039: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
26264 1727204235.87085: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
26264 1727204235.88117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204235.88534: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
26264 1727204235.88580: Loaded config def from plugin (connection/local)
26264 1727204235.88583: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
26264 1727204235.89632: Loaded config def from plugin (connection/paramiko_ssh)
26264 1727204235.89636: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
26264 1727204235.91152: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
26264 1727204235.91193: Loaded config def from plugin (connection/psrp)
26264 1727204235.91196: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
26264 1727204235.92117: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
26264 1727204235.92162: Loaded config def from plugin (connection/ssh)
26264 1727204235.92167: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
26264 1727204235.92647: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
26264 1727204235.92687: Loaded config def from plugin (connection/winrm)
26264 1727204235.92694: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
26264 1727204235.92725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
26264 1727204235.92788: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
26264 1727204235.92859: Loaded config def from plugin (shell/cmd)
26264 1727204235.92861: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
26264 1727204235.92888: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
26264 1727204235.93074: Loaded config def from plugin (shell/powershell)
26264 1727204235.93076: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
26264 1727204235.93242: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
26264 1727204235.93531: Loaded config def from plugin (shell/sh)
26264 1727204235.93534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
26264 1727204235.93679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
26264 1727204235.93986: Loaded config def from plugin (become/runas)
26264 1727204235.93988: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
26264 1727204235.95414: Loaded config def from plugin (become/su)
26264 1727204235.95443: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
26264 1727204235.95644: Loaded config def from plugin (become/sudo)
26264 1727204235.95646: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
26264 1727204235.95689: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
26264 1727204235.96054: in VariableManager get_vars()
26264 1727204235.96080: done with get_vars()
26264 1727204235.96258: trying /usr/local/lib/python3.12/site-packages/ansible/modules
26264 1727204236.00310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
26264 1727204236.00450: in VariableManager get_vars()
26264 1727204236.00455: done with get_vars()
26264 1727204236.00458: variable 'playbook_dir' from source: magic vars
26264 1727204236.00459: variable 'ansible_playbook_python' from source: magic vars
26264 1727204236.00459: variable 'ansible_config_file' from source: magic vars
26264 1727204236.00460: variable 'groups' from source: magic vars
26264 1727204236.00461: variable 'omit' from source: magic vars
26264 1727204236.00462: variable 'ansible_version' from source: magic vars
26264 1727204236.00463: variable 'ansible_check_mode' from source: magic vars
26264 1727204236.00468: variable 'ansible_diff_mode' from source: magic vars
26264 1727204236.00469: variable 'ansible_forks' from source: magic vars
26264 1727204236.00470: variable 'ansible_inventory_sources' from source: magic vars
26264 1727204236.00471: variable 'ansible_skip_tags' from source: magic vars
26264 1727204236.00471: variable 'ansible_limit' from source: magic vars
26264 1727204236.00472: variable 'ansible_run_tags' from source: magic vars
26264 1727204236.00473: variable 'ansible_verbosity' from source: magic vars
26264 1727204236.00509: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml
26264 1727204236.01222: in VariableManager get_vars()
26264 1727204236.01244: done with get_vars()
26264 1727204236.01289: in VariableManager get_vars()
26264 1727204236.01310: done with get_vars()
26264 1727204236.01356: in VariableManager get_vars()
26264 1727204236.01371: done with get_vars()
26264 1727204236.01402: in VariableManager get_vars()
26264 1727204236.01414: done with get_vars()
26264 1727204236.01496: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
26264 1727204236.01728: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
26264 1727204236.02074: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
26264 1727204236.02793: in VariableManager get_vars()
26264 1727204236.02813: done with get_vars()
26264 1727204236.03263: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
26264 1727204236.03417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
26264 1727204236.04770: in VariableManager get_vars()
26264 1727204236.04789: done with get_vars()
26264 1727204236.04933: in VariableManager get_vars()
26264 1727204236.04937: done with get_vars()
26264 1727204236.04939: variable 'playbook_dir' from source: magic vars
26264 1727204236.04940: variable 'ansible_playbook_python' from source: magic vars
26264 1727204236.04941: variable 'ansible_config_file' from source: magic vars
26264 1727204236.04942: variable 'groups' from source: magic vars
26264 1727204236.04942: variable 'omit' from source: magic vars
26264 1727204236.04943: variable 'ansible_version' from source: magic vars
26264 1727204236.04944: variable 'ansible_check_mode' from source: magic vars
26264 1727204236.04945: variable 'ansible_diff_mode' from source: magic vars
26264 1727204236.04945: variable 'ansible_forks' from source: magic vars
26264 1727204236.04946: variable 'ansible_inventory_sources' from source: magic vars
26264 1727204236.04947: variable 'ansible_skip_tags' from source: magic vars
26264 1727204236.04950: variable 'ansible_limit' from source: magic vars
26264 1727204236.04951: variable 'ansible_run_tags' from source: magic vars
26264 1727204236.04952: variable 'ansible_verbosity' from source: magic vars
26264 1727204236.04988: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
26264 1727204236.05070: in VariableManager get_vars()
26264 1727204236.05074: done with get_vars()
26264 1727204236.05076: variable 'playbook_dir' from source: magic vars
26264 1727204236.05077: variable 'ansible_playbook_python' from source: magic vars
26264 1727204236.05078: variable 'ansible_config_file' from source: magic vars
26264 1727204236.05078: variable 'groups' from source: magic vars
26264 1727204236.05079: variable 'omit' from source: magic vars
26264 1727204236.05080: variable 'ansible_version' from source: magic vars
26264 1727204236.05081: variable 'ansible_check_mode' from source: magic vars
26264 1727204236.05081: variable 'ansible_diff_mode' from source: magic vars
26264 1727204236.05082: variable 'ansible_forks' from source: magic vars
26264 1727204236.05083: variable 'ansible_inventory_sources' from source: magic vars
26264 1727204236.05084: variable 'ansible_skip_tags' from source: magic vars
26264 1727204236.05084: variable 'ansible_limit' from source: magic vars
26264 1727204236.05085: variable 'ansible_run_tags' from source: magic vars
26264 1727204236.05086: variable 'ansible_verbosity' from source: magic vars
26264 1727204236.05117: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
26264 1727204236.05209: in VariableManager get_vars()
26264 1727204236.05220: done with get_vars()
26264 1727204236.05275: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
26264 1727204236.05396: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
26264 1727204236.05484: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
26264 1727204236.05918: in VariableManager get_vars()
26264 1727204236.05938: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
26264 1727204236.07597: in VariableManager get_vars()
26264 1727204236.07616: done with get_vars()
26264 1727204236.07659: in VariableManager get_vars()
26264 1727204236.07662: done with get_vars()
26264 1727204236.07666: variable 'playbook_dir' from source: magic vars
26264 1727204236.07667: variable 'ansible_playbook_python' from source: magic vars
26264 1727204236.07668: variable 'ansible_config_file' from source: magic vars
26264 1727204236.07668: variable 'groups' from source: magic vars
26264 1727204236.07669: variable 'omit' from source: magic vars
26264 1727204236.07670: variable 'ansible_version' from source: magic vars
26264 1727204236.07670: variable 'ansible_check_mode' from source: magic vars
26264 1727204236.07671: variable 'ansible_diff_mode' from source: magic vars
26264 1727204236.07672: variable 'ansible_forks' from source: magic vars
26264 1727204236.07673: variable 'ansible_inventory_sources' from source: magic vars
26264 1727204236.07673: variable 'ansible_skip_tags' from source: magic vars
26264 1727204236.07674: variable 'ansible_limit' from source: magic vars
26264 1727204236.07675: variable 'ansible_run_tags' from source: magic vars
26264 1727204236.07675: variable 'ansible_verbosity' from source: magic vars
26264 1727204236.07706: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
26264 1727204236.07780: in VariableManager get_vars()
26264 1727204236.07791: done with get_vars()
26264 1727204236.07829: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
26264 1727204236.07968: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
26264 1727204236.08056: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
26264 1727204236.08494: in VariableManager get_vars()
26264 1727204236.08513: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
26264 1727204236.10223: in VariableManager get_vars()
26264 1727204236.10240: done with get_vars()
26264 1727204236.10286: in VariableManager get_vars()
26264 1727204236.10299: done with get_vars()
26264 1727204236.10368: in VariableManager get_vars()
26264 1727204236.10383: done with get_vars()
26264 1727204236.10489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
26264 1727204236.10504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
26264 1727204236.10758: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
26264 1727204236.11061: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
26264 1727204236.11065: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
26264 1727204236.11097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
26264 1727204236.11121: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
26264 1727204236.11444: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
26264 1727204236.11516: Loaded config def from plugin (callback/default)
26264 1727204236.11519: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
26264 1727204236.14772: Loaded config def from plugin (callback/junit)
26264 1727204236.14775: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
26264 1727204236.14827: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
26264 1727204236.14907: Loaded config def from plugin (callback/minimal)
26264 1727204236.14910: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
26264 1727204236.14953: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
26264 1727204236.15028: Loaded config def from plugin (callback/tree)
26264 1727204236.15033: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
26264 1727204236.15152: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
26264 1727204236.15155: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
26264 1727204236.15188: in VariableManager get_vars()
26264 1727204236.15201: done with get_vars()
26264 1727204236.15207: in VariableManager get_vars()
26264 1727204236.15216: done with get_vars()
26264 1727204236.15220: variable 'omit' from source: magic vars
26264 1727204236.15259: in VariableManager get_vars()
26264 1727204236.15275: done with get_vars()
26264 1727204236.15301: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
26264 1727204236.15884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
26264 1727204236.15963: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
26264 1727204236.15996: getting the remaining hosts for this loop
26264 1727204236.15998: done getting the remaining hosts for this loop
26264 1727204236.16001: getting the next task for host managed-node3
26264 1727204236.16005: done getting next task for host managed-node3
26264 1727204236.16006: ^ task is: TASK: Gathering Facts
26264 1727204236.16008: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204236.16015: getting variables
26264 1727204236.16016: in VariableManager get_vars()
26264 1727204236.16027: Calling all_inventory to load vars for managed-node3
26264 1727204236.16030: Calling groups_inventory to load vars for managed-node3
26264 1727204236.16032: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204236.16043: Calling all_plugins_play to load vars for managed-node3
26264 1727204236.16059: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204236.16062: Calling groups_plugins_play to load vars for managed-node3
26264 1727204236.16095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204236.16139: done with get_vars()
26264 1727204236.16145: done getting variables
26264 1727204236.16211: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Tuesday 24 September 2024 14:57:16 -0400 (0:00:00.011) 0:00:00.011 *****
26264 1727204236.16233: entering _queue_task() for managed-node3/gather_facts
26264 1727204236.16234: Creating lock for gather_facts
26264 1727204236.16579: worker is 1 (out of 1 available)
26264 1727204236.16589: exiting _queue_task() for managed-node3/gather_facts
26264 1727204236.16606: done queuing things up, now waiting for results queue to drain
26264 1727204236.16608: waiting for pending results...
26264 1727204236.17083: running TaskExecutor() for managed-node3/TASK: Gathering Facts
26264 1727204236.17859: in run() - task 0affcd87-79f5-5ff5-08b0-00000000007c
26264 1727204236.17880: variable 'ansible_search_path' from source: unknown
26264 1727204236.17918: calling self._execute()
26264 1727204236.17981: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204236.17992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204236.18003: variable 'omit' from source: magic vars
26264 1727204236.18098: variable 'omit' from source: magic vars
26264 1727204236.18134: variable 'omit' from source: magic vars
26264 1727204236.18174: variable 'omit' from source: magic vars
26264 1727204236.18221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
26264 1727204236.18260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
26264 1727204236.18326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
26264 1727204236.18354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204236.18370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204236.18401: variable 'inventory_hostname' from source: host vars for 'managed-node3'
26264 1727204236.18409: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204236.18415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204236.18513: Set connection var ansible_pipelining to False
26264 1727204236.19175: Set connection var ansible_connection to ssh
26264 1727204236.19182: Set connection var ansible_shell_type to sh
26264 1727204236.19194: Set connection var ansible_shell_executable to /bin/sh
26264 1727204236.19205: Set connection var ansible_timeout to 10
26264 1727204236.19216: Set connection var ansible_module_compression to ZIP_DEFLATED
26264 1727204236.19245: variable 'ansible_shell_executable' from source: unknown
26264 1727204236.19254: variable 'ansible_connection' from source: unknown
26264 1727204236.19262: variable 'ansible_module_compression' from source: unknown
26264 1727204236.19270: variable 'ansible_shell_type' from source: unknown
26264 1727204236.19276: variable 'ansible_shell_executable' from source: unknown
26264 1727204236.19282: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204236.19289: variable 'ansible_pipelining' from source: unknown
26264 1727204236.19294: variable 'ansible_timeout' from source: unknown
26264 1727204236.19301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204236.19489: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
26264 1727204236.19505: variable 'omit' from source: magic vars
26264 1727204236.19513: starting attempt loop
26264 1727204236.19519: running the handler
26264 1727204236.19541: variable 'ansible_facts' from source: unknown
26264 1727204236.19567: _low_level_execute_command(): starting
26264 1727204236.19579: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
26264 1727204236.21385: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204236.21448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204236.21452: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204236.21455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204236.21636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204236.21723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204236.21727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204236.21796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204236.23422: stdout chunk (state=3): >>>/root <<<
26264 1727204236.23523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204236.23619: stderr chunk (state=3): >>><<<
26264 1727204236.23622: stdout chunk (state=3): >>><<<
26264 1727204236.23734: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
26264 1727204236.23738: _low_level_execute_command(): starting
26264 1727204236.23740: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158 `" && echo ansible-tmp-1727204236.236435-26352-270796511036158="` echo /root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158 `" ) && sleep 0'
26264 1727204236.25092: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204236.25100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204236.25250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<<
26264 1727204236.25254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204236.25263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204236.25421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204236.25435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204236.25548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204236.25619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204236.27649: stdout chunk (state=3): >>>ansible-tmp-1727204236.236435-26352-270796511036158=/root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158 <<<
26264 1727204236.27759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204236.27844: stderr chunk (state=3): >>><<<
26264 1727204236.27848: stdout chunk (state=3): >>><<<
26264 1727204236.28171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204236.236435-26352-270796511036158=/root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2:
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204236.28174: variable 'ansible_module_compression' from source: unknown 26264 1727204236.28177: ANSIBALLZ: Using generic lock for ansible.legacy.setup 26264 1727204236.28179: ANSIBALLZ: Acquiring lock 26264 1727204236.28181: ANSIBALLZ: Lock acquired: 139841028923536 26264 1727204236.28183: ANSIBALLZ: Creating module 26264 1727204236.77198: ANSIBALLZ: Writing module into payload 26264 1727204236.77497: ANSIBALLZ: Writing module 26264 1727204236.77647: ANSIBALLZ: Renaming module 26264 1727204236.77654: ANSIBALLZ: Done creating module 26264 1727204236.77697: variable 'ansible_facts' from source: unknown 26264 1727204236.77703: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204236.77714: _low_level_execute_command(): starting 26264 1727204236.77720: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 26264 1727204236.79222: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204236.79228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204236.79279: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204236.79283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 26264 1727204236.79301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204236.79307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204236.79312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204236.79395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204236.79423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204236.79502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204236.81604: stdout chunk (state=3): >>>PLATFORM <<< 26264 1727204236.81696: stdout chunk (state=3): >>>Linux FOUND <<< 26264 1727204236.81719: stdout chunk (state=3): >>>/usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 26264 1727204236.81887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204236.81936: stderr chunk (state=3): >>><<< 26264 1727204236.81939: stdout chunk (state=3): >>><<< 26264 1727204236.81959: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204236.81972 [managed-node3]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 26264 1727204236.82017: _low_level_execute_command(): starting 26264 1727204236.82022: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 26264 1727204236.82410: Sending initial data 26264 1727204236.82415: Sent initial data (1181 bytes) 26264 1727204236.82964: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204236.82972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204236.83017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204236.83023: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 26264 1727204236.83041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204236.83046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204236.83060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204236.83147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204236.83167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204236.83237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204236.87560: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 26264 1727204236.88073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204236.88077: stdout chunk (state=3): >>><<< 26264 1727204236.88084: stderr chunk (state=3): >>><<< 26264 1727204236.88099: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel 
fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204236.88182: variable 'ansible_facts' from source: unknown 26264 1727204236.88186: variable 'ansible_facts' from source: unknown 26264 1727204236.88196: variable 'ansible_module_compression' from source: unknown 26264 1727204236.88242: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204236.88277: variable 'ansible_facts' from source: unknown 26264 1727204236.88437: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158/AnsiballZ_setup.py 26264 1727204236.88772: Sending initial data 26264 1727204236.88776: Sent initial data (153 bytes) 26264 1727204236.89821: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204236.89834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204236.89841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204236.89859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204236.89901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204236.89913: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204236.89924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204236.89939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204236.89946: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204236.89956: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204236.89965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204236.89975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204236.89986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204236.89994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204236.90000: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204236.90009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 
1727204236.90086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204236.90105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204236.90117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204236.90217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204236.92022: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204236.92060: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204236.92105: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpumbk21_t /root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158/AnsiballZ_setup.py <<< 26264 1727204236.92142: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204236.95301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204236.95402: stderr chunk (state=3): >>><<< 26264 1727204236.95405: stdout chunk (state=3): >>><<< 26264 1727204236.95431: done transferring module to remote 26264 1727204236.95447: _low_level_execute_command(): starting 26264 1727204236.95455: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158/ /root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158/AnsiballZ_setup.py && sleep 0' 26264 1727204236.96974: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204236.96984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204236.97106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204236.97110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 26264 1727204236.97246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204236.97250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204236.97263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204236.97269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204236.97354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204236.97361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204236.97439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204236.99274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204236.99345: stderr 
chunk (state=3): >>><<< 26264 1727204236.99349: stdout chunk (state=3): >>><<< 26264 1727204236.99374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204236.99378: _low_level_execute_command(): starting 26264 1727204236.99383: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158/AnsiballZ_setup.py && sleep 0' 26264 1727204237.00042: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204237.00055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204237.00068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204237.00105: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204237.00137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204237.00144: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204237.00301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204237.00305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204237.00307: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204237.00309: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204237.00311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204237.00313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204237.00315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204237.00317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204237.00319: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204237.00321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204237.00350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204237.00355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204237.00357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204237.00471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204237.02450: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 26264 1727204237.02460: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import 
'_weakref' # <<< 26264 1727204237.02524: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 26264 1727204237.02554: stdout chunk (state=3): >>>import 'posix' # <<< 26264 1727204237.02585: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 26264 1727204237.02635: stdout chunk (state=3): >>>import 'time' # <<< 26264 1727204237.02638: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 26264 1727204237.02691: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204237.02711: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 26264 1727204237.02733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 26264 1727204237.02758: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a9b3dc0> <<< 26264 1727204237.02809: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 26264 1727204237.02842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a9583a0> <<< 26264 1727204237.02856: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a9b3b20> <<< 26264 1727204237.02898: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 
'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a9b3ac0> <<< 26264 1727204237.02913: stdout chunk (state=3): >>>import '_signal' # <<< 26264 1727204237.02940: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a958490> <<< 26264 1727204237.02990: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 26264 1727204237.03008: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # <<< 26264 1727204237.03020: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a958940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a958670> <<< 26264 1727204237.03055: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 26264 1727204237.03102: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 26264 1727204237.03113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 26264 1727204237.03141: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 26264 1727204237.03155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 26264 1727204237.03187: 
stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a90f190> <<< 26264 1727204237.03211: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 26264 1727204237.03214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 26264 1727204237.03280: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a90f220> <<< 26264 1727204237.03312: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 26264 1727204237.03355: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a932850> <<< 26264 1727204237.03358: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a90f940> <<< 26264 1727204237.03417: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a970880> <<< 26264 1727204237.03434: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a908d90> <<< 26264 1727204237.03477: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from 
'/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a932d90> <<< 26264 1727204237.04009: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a958970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 26264 1727204237.04094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 26264 1727204237.04129: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8aff10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8b40a0> <<< 26264 1727204237.04160: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 26264 1727204237.04179: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 26264 1727204237.04233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8a75b0> <<< 26264 1727204237.04294: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8ae6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8af3d0> <<< 26264 1727204237.04299: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 26264 1727204237.04359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 26264 1727204237.04375: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 26264 1727204237.04398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204237.04421: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 26264 1727204237.04494: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a831e50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a831940> <<< 26264 1727204237.04497: stdout chunk (state=3): >>>import 'itertools' # <<< 26264 1727204237.04524: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a831f40> <<< 26264 1727204237.04561: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 26264 1727204237.04608: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a831d90> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842100> <<< 26264 1727204237.04621: stdout chunk (state=3): >>>import '_collections' # <<< 26264 1727204237.04676: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a889dc0> import '_functools' # <<< 26264 1727204237.04721: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8826a0> <<< 26264 1727204237.04759: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a895700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8b5eb0> <<< 26264 1727204237.04791: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 26264 1727204237.04827: stdout chunk 
(state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a842d00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8892e0> <<< 26264 1727204237.04872: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a895310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8bba60> <<< 26264 1727204237.04907: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 26264 1727204237.04961: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204237.04989: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 26264 1727204237.04996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842ee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842e20> <<< 26264 1727204237.05030: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842d90> <<< 26264 1727204237.05066: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 26264 1727204237.05071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 26264 1727204237.05084: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 26264 1727204237.05109: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 26264 1727204237.05167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 26264 1727204237.05192: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a572400> <<< 26264 1727204237.05209: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 26264 1727204237.05221: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 26264 1727204237.05261: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a5724f0> <<< 26264 1727204237.05381: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f207a84af70> <<< 26264 1727204237.05426: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a844ac0> <<< 26264 1727204237.05429: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a844490> <<< 26264 1727204237.05469: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 26264 1727204237.05472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 26264 1727204237.05514: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 26264 1727204237.05521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 26264 1727204237.05553: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 26264 1727204237.05556: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4a6250> <<< 26264 1727204237.05587: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a55d550> <<< 26264 1727204237.05642: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a844f40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8bb0d0> <<< 26264 1727204237.05677: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 26264 1727204237.05696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 26264 1727204237.05715: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 26264 1727204237.05743: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4b8b80> import 'errno' # <<< 26264 1727204237.05780: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a4b8eb0> <<< 26264 1727204237.05800: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 26264 1727204237.05820: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 26264 1727204237.05838: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 26264 1727204237.05859: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4c97c0> <<< 26264 1727204237.05875: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 26264 1727204237.05906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 26264 1727204237.05932: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4c9d00> <<< 26264 1727204237.05980: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' 
executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a458430> <<< 26264 1727204237.05983: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4b8fa0> <<< 26264 1727204237.06000: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 26264 1727204237.06011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 26264 1727204237.06075: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204237.06078: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a468310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4c9640> <<< 26264 1727204237.06106: stdout chunk (state=3): >>>import 'pwd' # <<< 26264 1727204237.06109: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204237.06111: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a4683d0> <<< 26264 1727204237.06149: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842a60> <<< 26264 1727204237.06171: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 26264 1727204237.06194: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 26264 1727204237.06215: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 26264 1727204237.06218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 26264 1727204237.06259: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a484730> <<< 26264 1727204237.06274: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py <<< 26264 1727204237.06294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 26264 1727204237.06311: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a484a00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4847f0> <<< 26264 1727204237.06340: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a4848e0> <<< 26264 1727204237.06372: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code 
object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 26264 1727204237.06578: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a484d30> <<< 26264 1727204237.06626: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204237.06630: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a48e280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a484970> <<< 26264 1727204237.06654: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a477ac0> <<< 26264 1727204237.06673: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842640> <<< 26264 1727204237.06686: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 26264 1727204237.06757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 26264 1727204237.06786: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a484b20> <<< 26264 1727204237.06939: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 26264 1727204237.06952: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f207a39e700> <<< 26264 
1727204237.07252: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 26264 1727204237.07346: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.07379: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 26264 1727204237.07414: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.07417: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 26264 1727204237.07430: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.08669: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.10013: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079dae850> <<< 26264 1727204237.10045: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204237.10088: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 26264 1727204237.10104: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 26264 1727204237.10117: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079dae160> <<< 26264 1727204237.10174: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079dae280> <<< 26264 1727204237.10206: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079daefa0> <<< 26264 1727204237.10233: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 26264 1727204237.10302: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079dae4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079daedc0> import 'atexit' # <<< 26264 1727204237.10340: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079dae580> <<< 26264 1727204237.10355: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 26264 1727204237.10381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 26264 1727204237.10436: stdout chunk (state=3): >>>import 'locale' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2079dae100> <<< 26264 1727204237.10477: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 26264 1727204237.10488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 26264 1727204237.10507: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 26264 1727204237.10739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d830a0> <<< 26264 1727204237.10787: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079c88370> <<< 26264 1727204237.10851: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204237.10866: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079c88070> <<< 26264 1727204237.10883: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 26264 1727204237.10907: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 26264 1727204237.10975: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079c88cd0> <<< 26264 1727204237.11006: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d96dc0> <<< 26264 1727204237.11310: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d963a0> <<< 26264 1727204237.11372: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 26264 1727204237.11375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 26264 1727204237.11419: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d96f40> <<< 26264 1727204237.11445: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 26264 1727204237.11477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 26264 1727204237.11543: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 26264 1727204237.11584: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 26264 1727204237.11606: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 26264 1727204237.11707: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 26264 1727204237.11736: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079de3f40> <<< 26264 1727204237.11825: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079db5d60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079db5430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d61af0> <<< 26264 1727204237.11861: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079db5550> <<< 26264 1727204237.11903: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079db5580> <<< 26264 1727204237.11983: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 26264 1727204237.11999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 26264 1727204237.12054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 26264 1727204237.12150: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # 
extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079cf6fa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079df5280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 26264 1727204237.12163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 26264 1727204237.12215: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079cf3820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079df5400> <<< 26264 1727204237.12242: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 26264 1727204237.12294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204237.12310: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 26264 1727204237.12392: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079df5c40> <<< 26264 1727204237.12512: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079cf37c0> <<< 26264 1727204237.12604: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079d8e1c0> <<< 26264 1727204237.12670: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079df59d0> <<< 26264 1727204237.12701: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079df5550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079dee940> <<< 26264 1727204237.12741: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 26264 1727204237.12755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 26264 1727204237.12799: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079ce8910> <<< 26264 1727204237.13015: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079d06dc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079cf2550> <<< 26264 1727204237.13067: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079ce8eb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079cf2970> # zipimport: zlib available <<< 26264 1727204237.13091: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 26264 1727204237.13094: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.13176: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.13242: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 26264 1727204237.13303: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 26264 1727204237.13308: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.13385: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.13489: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.13961: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.14871: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079d2f7f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d348b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2079884940> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available <<< 26264 1727204237.14891: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 26264 1727204237.15039: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.15257: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 26264 1727204237.15274: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 26264 1727204237.15324: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d6c730> <<< 26264 1727204237.15357: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.16042: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.16684: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.16789: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.16901: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py<<< 26264 1727204237.16906: stdout chunk (state=3): >>> <<< 26264 1727204237.16935: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.16988: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.17048: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 26264 1727204237.17075: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.17080: stdout chunk (state=3): >>> <<< 26264 1727204237.17177: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.17296: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py<<< 26264 1727204237.17300: stdout chunk (state=3): >>> <<< 26264 1727204237.17328: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.17332: stdout chunk (state=3): >>> <<< 26264 1727204237.17357: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.17386: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 26264 1727204237.17417: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.17422: stdout chunk (state=3): >>> <<< 26264 1727204237.17481: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.17540: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 26264 1727204237.17571: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.17892: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.18227: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 26264 1727204237.18281: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc'<<< 26264 1727204237.18286: stdout chunk (state=3): >>> <<< 26264 1727204237.18313: stdout chunk (state=3): >>>import '_ast' # <<< 26264 1727204237.18317: stdout chunk (state=3): >>> <<< 26264 1727204237.18433: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079db12e0> <<< 26264 1727204237.18468: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.18570: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.18574: stdout chunk (state=3): >>> <<< 26264 1727204237.18671: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 26264 1727204237.18696: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py <<< 26264 1727204237.18727: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 26264 1727204237.18758: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 26264 1727204237.18800: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.18863: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.18928: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 26264 1727204237.18958: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.19024: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.19092: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.19234: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.19343: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 26264 1727204237.19400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204237.19528: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204237.19550: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204237.19566: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079d26880> <<< 26264 1727204237.19737: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079700550><<< 26264 1727204237.19742: stdout chunk (state=3): >>> <<< 26264 1727204237.19791: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py<<< 26264 1727204237.19810: stdout chunk (state=3): >>> <<< 26264 1727204237.19814: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py<<< 26264 1727204237.19817: stdout chunk (state=3): >>> <<< 26264 1727204237.19845: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.19853: stdout chunk (state=3): >>> <<< 26264 1727204237.19934: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.19939: stdout chunk (state=3): >>> <<< 26264 1727204237.20029: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.20036: stdout chunk (state=3): >>> <<< 26264 1727204237.20079: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.20082: stdout chunk (state=3): >>> <<< 26264 1727204237.20150: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 26264 1727204237.20169: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 26264 1727204237.20199: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 26264 1727204237.20252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 26264 1727204237.20281: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 26264 1727204237.20312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 26264 1727204237.20452: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d37910> <<< 26264 1727204237.20513: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d80970> <<< 26264 1727204237.20599: stdout chunk (state=3): >>>import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2079d6a850> # destroy ansible.module_utils.distro <<< 26264 1727204237.20612: stdout chunk (state=3): >>>import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 26264 1727204237.20649: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.20678: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py <<< 26264 1727204237.20683: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 26264 1727204237.20779: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 26264 1727204237.20800: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.20807: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.20834: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 26264 1727204237.20843: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.20943: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21006: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21012: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21025: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 
1727204237.21063: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21092: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21134: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21171: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 26264 1727204237.21190: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21247: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21319: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21331: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21376: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 26264 1727204237.21528: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21673: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21715: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.21767: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204237.21795: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 26264 1727204237.21799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 26264 1727204237.21831: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 26264 1727204237.21834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 26264 1727204237.21863: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079605c70> <<< 26264 1727204237.21890: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 26264 1727204237.21912: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 26264 1727204237.21960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 26264 1727204237.21981: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 26264 1727204237.21994: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079865a30> <<< 26264 1727204237.22057: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20798659a0> <<< 26264 1727204237.22113: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20798b1b20> <<< 26264 1727204237.22136: stdout chunk (state=3): 
>>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20798b1550> <<< 26264 1727204237.22168: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20798992e0> <<< 26264 1727204237.22206: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079899970> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 26264 1727204237.22220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 26264 1727204237.22251: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 26264 1727204237.22278: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207984a2b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207984aa00> <<< 26264 1727204237.22310: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 26264 1727204237.22350: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207984a940> <<< 26264 1727204237.22425: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py 
<<< 26264 1727204237.22436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 26264 1727204237.22459: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20796660d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d223a0> <<< 26264 1727204237.22535: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079899670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 26264 1727204237.22539: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 26264 1727204237.22542: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 26264 1727204237.22544: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.22587: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.23133: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available <<< 26264 1727204237.23187: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.23264: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.23333: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py <<< 26264 1727204237.23336: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 26264 1727204237.24009: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.24602: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 26264 1727204237.24608: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.24681: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.24742: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.24784: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.24827: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 26264 1727204237.24830: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 26264 1727204237.24834: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.24877: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.24909: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 26264 1727204237.24917: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.24973: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.25043: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 26264 1727204237.25048: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.25082: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.25119: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 26264 1727204237.25127: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.25156: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.25191: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 26264 1727204237.25201: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.25291: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.25395: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py <<< 26264 1727204237.25400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 26264 1727204237.25437: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079555eb0> <<< 26264 1727204237.25462: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 26264 1727204237.25492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 26264 1727204237.25732: stdout chunk (state=3): >>>import 
'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20795559d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 26264 1727204237.25738: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.25795: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.26008: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 26264 1727204237.26110: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.26207: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 26264 1727204237.26304: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 26264 1727204237.26383: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 26264 1727204237.26477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 26264 1727204237.26509: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204237.26515: stdout chunk (state=3): >>># extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20795c1bb0> <<< 26264 1727204237.26819: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207957ca60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 26264 1727204237.26822: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.26865: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.26909: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 26264 1727204237.26920: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.26987: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.27060: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.27147: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.27292: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 26264 1727204237.27322: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.27369: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 26264 1727204237.27373: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.27398: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.27442: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 26264 1727204237.27528: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20795c8040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20795c86d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 26264 1727204237.27544: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 26264 1727204237.27580: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.27614: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 26264 1727204237.27620: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.28027: 
stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 26264 1727204237.28171: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.28291: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 26264 1727204237.28326: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 26264 1727204237.28331: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 26264 1727204237.28468: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.28489: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.29032: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 26264 1727204237.29083: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.29089: stdout chunk (state=3): >>> <<< 26264 1727204237.29266: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py<<< 26264 1727204237.29272: stdout chunk (state=3): >>> <<< 26264 1727204237.29296: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.29301: stdout chunk (state=3): >>> <<< 26264 1727204237.29359: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.29366: stdout chunk (state=3): >>> <<< 26264 1727204237.29418: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.29427: stdout chunk (state=3): >>> <<< 26264 1727204237.30200: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.30207: stdout chunk (state=3): >>> <<< 26264 1727204237.30761: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 26264 1727204237.30773: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.30872: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.30971: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 26264 1727204237.30983: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.31059: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.31145: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 26264 1727204237.31695: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available <<< 26264 1727204237.31748: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.31874: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.32171: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.32843: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 26264 1727204237.32912: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.33000: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 26264 1727204237.33030: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.33033: stdout chunk (state=3): >>> <<< 26264 1727204237.33101: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.33194: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 26264 1727204237.33208: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.33577: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.33583: stdout chunk (state=3): >>> <<< 26264 1727204237.33938: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 26264 1727204237.33962: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.33969: stdout chunk (state=3): >>> <<< 26264 1727204237.34043: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.34050: stdout chunk (state=3): >>> <<< 26264 1727204237.34123: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py<<< 26264 1727204237.34131: stdout chunk (state=3): >>> <<< 26264 1727204237.34152: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.34196: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.34204: stdout chunk (state=3): >>> <<< 26264 1727204237.34246: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py<<< 26264 1727204237.34270: stdout chunk (state=3): >>> <<< 26264 1727204237.34273: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.34278: stdout chunk (state=3): >>> <<< 26264 1727204237.34318: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.34324: stdout chunk (state=3): >>> <<< 26264 1727204237.34365: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py<<< 26264 1727204237.34382: stdout chunk (state=3): >>> <<< 26264 1727204237.34392: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.34440: stdout chunk (state=3): >>># zipimport: 
zlib available<<< 26264 1727204237.34447: stdout chunk (state=3): >>> <<< 26264 1727204237.34488: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py<<< 26264 1727204237.34493: stdout chunk (state=3): >>> <<< 26264 1727204237.34518: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.34524: stdout chunk (state=3): >>> <<< 26264 1727204237.34619: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.34626: stdout chunk (state=3): >>> <<< 26264 1727204237.34724: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py<<< 26264 1727204237.34747: stdout chunk (state=3): >>> <<< 26264 1727204237.34750: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.34766: stdout chunk (state=3): >>> <<< 26264 1727204237.34790: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.34794: stdout chunk (state=3): >>> <<< 26264 1727204237.34796: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 26264 1727204237.34822: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.34828: stdout chunk (state=3): >>> <<< 26264 1727204237.34881: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.34886: stdout chunk (state=3): >>> <<< 26264 1727204237.34943: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py<<< 26264 1727204237.34956: stdout chunk (state=3): >>> <<< 26264 1727204237.34972: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.35000: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.35005: stdout chunk (state=3): >>> <<< 26264 1727204237.35036: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.35041: stdout chunk (state=3): >>> <<< 26264 1727204237.35107: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.35178: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.35281: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.35386: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 26264 1727204237.35430: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 26264 1727204237.35447: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.35502: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.35590: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 26264 1727204237.35602: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 26264 1727204237.35877: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.36144: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 26264 1727204237.36171: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.36183: stdout chunk (state=3): >>> <<< 26264 1727204237.36232: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.36295: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 26264 1727204237.36324: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.36389: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.36454: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 26264 1727204237.36478: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.36490: stdout chunk (state=3): >>> <<< 26264 1727204237.36583: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204237.37079: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib 
available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 26264 1727204237.37112: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204237.37120: stdout chunk (state=3): >>> <<< 26264 1727204237.38429: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 26264 1727204237.38457: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 26264 1727204237.38485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 26264 1727204237.38524: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207954a310> <<< 26264 1727204237.38546: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207954a460> <<< 26264 1727204237.38633: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f20794f8910> <<< 26264 1727204237.39120: stdout chunk (state=3): >>>import 'gc' # <<< 26264 1727204237.47510: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 26264 1727204237.47550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 26264 1727204237.47570: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207954adf0> <<< 26264 1727204237.47591: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 26264 1727204237.47610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 26264 1727204237.47642: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207950d4f0> <<< 26264 1727204237.47706: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 26264 1727204237.47712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204237.47753: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py <<< 26264 1727204237.47762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207935deb0> <<< 26264 1727204237.47774: stdout chunk (state=3): >>>import 'multiprocessing.dummy' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f207935d610> <<< 26264 1727204237.48277: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 26264 1727204237.68445: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, 
"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39"<<< 26264 1727204237.68468: stdout chunk (state=3): >>>, "day": "24", "hour": "14", "minute": "57", "second": "17", "epoch": "1727204237", "epoch_int": "1727204237", "date": "2024-09-24", "time": "14:57:17", "iso8601_micro": "2024-09-24T18:57:17.396572Z", "iso8601": "2024-09-24T18:57:17Z", "iso8601_basic": "20240924T145717396572", "iso8601_basic_short": "20240924T145717", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", 
"prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_o<<< 26264 1727204237.68494: stdout chunk (state=3): >>>ffload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": 
"off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2810, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 722, "free": 2810}, "nocache": {"free": 3269, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host":<<< 26264 1727204237.68497: stdout chunk (state=3): >>> "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 583, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264279797760, "block_size": 4096, "block_total": 65519355, "block_available": 64521435, "block_used": 997920, "inode_total": 131071472, "inode_available": 130998308, "inode_used": 73164, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", 
"LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.33, "5m": 0.34, "15m": 0.18}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204237.69000: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1<<< 26264 1727204237.69035: stdout chunk (state=3): >>> # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins <<< 26264 1727204237.69068: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] 
removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs <<< 26264 1727204237.69094: stdout chunk (state=3): >>># cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat <<< 26264 1727204237.69128: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale<<< 26264 1727204237.69204: stdout chunk (state=3): >>> # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq <<< 26264 1727204237.69231: stdout chunk (state=3): >>># cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator <<< 26264 1727204237.69269: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing 
binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap <<< 26264 1727204237.69296: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading <<< 26264 1727204237.69311: stdout chunk (state=3): >>># cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 <<< 26264 1727204237.69388: stdout chunk (state=3): >>># cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile <<< 26264 1727204237.69392: stdout chunk (state=3): >>># destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl<<< 26264 1727204237.69395: stdout 
chunk (state=3): >>> # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token <<< 26264 1727204237.69398: stdout chunk (state=3): >>># cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string <<< 26264 1727204237.69400: stdout chunk (state=3): >>># cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon <<< 26264 1727204237.69402: stdout chunk (state=3): >>># cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves <<< 26264 1727204237.69404: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing 
ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy <<< 26264 1727204237.69406: stdout chunk (state=3): >>># destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing<<< 26264 1727204237.69408: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters <<< 26264 1727204237.69409: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext<<< 26264 1727204237.69411: stdout chunk (state=3): >>> # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing 
ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info <<< 26264 1727204237.69413: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 26264 1727204237.69415: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other<<< 26264 1727204237.69420: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing 
ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime <<< 26264 1727204237.69422: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr <<< 26264 1727204237.69424: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base <<< 26264 1727204237.69426: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux <<< 
26264 1727204237.69427: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd <<< 26264 1727204237.69429: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors <<< 26264 1727204237.69431: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor <<< 26264 1727204237.69433: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware <<< 26264 1727204237.69435: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # 
destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux <<< 26264 1727204237.69437: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd <<< 26264 1727204237.69443: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing 
encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 26264 1727204237.69680: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 26264 1727204237.69705: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 26264 1727204237.69739: stdout chunk (state=3): >>># destroy zipimport <<< 26264 1727204237.69759: stdout chunk (state=3): >>># destroy _compression <<< 26264 1727204237.69802: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 26264 1727204237.69807: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal <<< 26264 1727204237.69810: stdout chunk (state=3): >>># destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 26264 1727204237.69812: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 26264 1727204237.69841: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 26264 1727204237.69885: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 26264 1727204237.69939: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 26264 1727204237.69942: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle <<< 26264 1727204237.69945: stdout chunk (state=3): >>># destroy _compat_pickle <<< 26264 1727204237.69972: stdout chunk (state=3): >>># destroy queue <<< 26264 1727204237.69982: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 26264 1727204237.69987: stdout chunk 
(state=3): >>># destroy shlex <<< 26264 1727204237.70011: stdout chunk (state=3): >>># destroy datetime <<< 26264 1727204237.70014: stdout chunk (state=3): >>># destroy base64 <<< 26264 1727204237.70043: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 26264 1727204237.70051: stdout chunk (state=3): >>># destroy json <<< 26264 1727204237.70100: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 26264 1727204237.70104: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection <<< 26264 1727204237.70107: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util <<< 26264 1727204237.70109: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 26264 1727204237.70176: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 26264 1727204237.70198: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 26264 1727204237.70258: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 26264 1727204237.70275: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 <<< 26264 1727204237.70306: stdout chunk (state=3): >>># cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # 
cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 26264 1727204237.70396: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select <<< 26264 1727204237.70416: stdout chunk (state=3): >>># cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 <<< 26264 1727204237.70430: stdout chunk (state=3): >>># cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 26264 1727204237.70445: stdout chunk (state=3): >>># destroy fnmatch <<< 26264 1727204237.70448: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno <<< 26264 1727204237.70450: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 26264 1727204237.70453: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap <<< 26264 1727204237.70455: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile <<< 26264 1727204237.70457: stdout chunk (state=3): >>># destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools <<< 26264 1727204237.70459: stdout chunk (state=3): >>># cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 26264 1727204237.70461: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] 
wiping _heapq # cleanup[3] wiping sre_parse <<< 26264 1727204237.70509: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types <<< 26264 1727204237.70512: stdout chunk (state=3): >>># cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 26264 1727204237.70514: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath<<< 26264 1727204237.70521: stdout chunk (state=3): >>> # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc <<< 26264 1727204237.70524: stdout chunk (state=3): >>># cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases <<< 26264 1727204237.70527: stdout chunk (state=3): >>># cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 26264 1727204237.70541: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 26264 1727204237.70698: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 26264 1727204237.70779: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat <<< 26264 1727204237.70800: stdout chunk (state=3): >>># destroy 
ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 26264 1727204237.70813: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 26264 1727204237.70870: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 26264 1727204237.71288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204237.71292: stdout chunk (state=3): >>><<< 26264 1727204237.71294: stderr chunk (state=3): >>><<< 26264 1727204237.71489: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a9b3dc0> # 
/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a9583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a9b3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a9b3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a958490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a958940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a958670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from 
'/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a90f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a90f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a932850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a90f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a970880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a908d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a932d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a958970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8aff10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8b40a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8a75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8ae6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8af3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a831e50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a831940> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a831f40> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a831d90> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842100> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a889dc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8826a0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f207a895700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8b5eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a842d00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8892e0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a895310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8bba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842ee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842e20> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842d90> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a572400> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a5724f0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a84af70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a844ac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a844490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4a6250> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a55d550> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a844f40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a8bb0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4b8b80> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a4b8eb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4c97c0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4c9d00> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a458430> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4b8fa0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a468310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4c9640> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a4683d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842a60> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a484730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a484a00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a4847f0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a4848e0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a484d30> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207a48e280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a484970> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a477ac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a842640> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207a484b20> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f207a39e700> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079dae850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079dae160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079dae280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079daefa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079dae4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079daedc0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079dae580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079dae100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d830a0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079c88370> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079c88070> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079c88cd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d96dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d963a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d96f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079de3f40> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079db5d60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079db5430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d61af0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079db5550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079db5580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079cf6fa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079df5280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079cf3820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079df5400> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079df5c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079cf37c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079d8e1c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079df59d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079df5550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079dee940> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079ce8910> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079d06dc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079cf2550> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079ce8eb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079cf2970> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079d2f7f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d348b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079884940> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d6c730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079db12e0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2079d26880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079700550> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d37910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079d80970> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2079d6a850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079605c70> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079865a30> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20798659a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20798b1b20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20798b1550> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20798992e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079899970> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207984a2b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207984aa00> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207984a940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20796660d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2079d223a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079899670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2079555eb0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20795559d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20795c1bb0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207957ca60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20795c8040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20795c86d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_35acg6d3/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f207954a310> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207954a460> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20794f8910> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207954adf0> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207950d4f0> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207935deb0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f207935d610> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"17639b67ac7f4f0eaf69642a93854be7", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "17", "epoch": "1727204237", "epoch_int": "1727204237", "date": "2024-09-24", "time": "14:57:17", "iso8601_micro": "2024-09-24T18:57:17.396572Z", "iso8601": "2024-09-24T18:57:17Z", "iso8601_basic": "20240924T145717396572", "iso8601_basic_short": "20240924T145717", "tz": "EDT", "tz_dst": "EDT", "tz_offset": 
"-0400"}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": 
"off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off 
[fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) 
Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2810, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 722, "free": 2810}, "nocache": {"free": 3269, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 583, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264279797760, "block_size": 4096, "block_total": 65519355, "block_available": 64521435, "block_used": 997920, "inode_total": 131071472, "inode_available": 130998308, "inode_used": 73164, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.33, "5m": 0.34, "15m": 0.18}, "ansible_fibre_channel_wwn": [], 
"ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # 
cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] 
removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # 
destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy 
ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # 
cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # 
destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] 
wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy 
_signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.15.87 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing 
ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] 
removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing 
ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy 
ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy 
ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy 
multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # 
cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # 
destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks
[WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
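[Editor's note: the interpreter-discovery warning above means Ansible resolved `/usr/bin/python3.9` on managed-node3 at runtime. One standard way to make the path explicit and silence the warning is to pin `ansible_python_interpreter` in the inventory. The fragment below is a hypothetical sketch, not the actual inventory-5vW.yml used in this run (that file's contents are not shown in the log); the `ansible_host` value is taken from the SSH traffic above.]

```yaml
# Hypothetical inventory fragment: pinning the interpreter disables
# runtime discovery, so the [WARNING] above is no longer emitted.
all:
  hosts:
    managed-node3:
      ansible_host: 10.31.15.87                      # address seen in the log
      ansible_python_interpreter: /usr/bin/python3.9 # pinned, not discovered
```

Equivalently, `interpreter_python` can be set under `[defaults]` in ansible.cfg to apply one policy to all hosts.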
26264 1727204237.72933: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204237.72936: _low_level_execute_command(): starting 26264 1727204237.72939: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204236.236435-26352-270796511036158/ > /dev/null 2>&1 && sleep 0' 26264 1727204237.73522: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204237.73526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204237.73573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204237.73581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204237.73604: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204237.73607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204237.73674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204237.73677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204237.73679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204237.73717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204237.76127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204237.76727: stderr chunk (state=3): >>><<< 26264 1727204237.76731: stdout chunk (state=3): >>><<< 26264 1727204237.76734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204237.76736: handler run complete 26264 1727204237.76739: variable 'ansible_facts' from source: unknown 26264 1727204237.76741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204237.77093: variable 'ansible_facts' from source: unknown 26264 1727204237.77097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204237.77112: attempt loop complete, returning result 26264 1727204237.77116: _execute() done 26264 1727204237.77118: dumping result to json 26264 1727204237.77148: done dumping result, returning 26264 1727204237.77158: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-5ff5-08b0-00000000007c] 26264 1727204237.77163: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000007c
ok: [managed-node3]
26264 1727204237.77997: no more pending results, returning what we have 26264 1727204237.78000: results queue empty 26264 1727204237.78001: checking for any_errors_fatal 26264 1727204237.78002: done checking for any_errors_fatal 26264 1727204237.78003: checking for max_fail_percentage 26264 1727204237.78005: done checking for max_fail_percentage 26264 1727204237.78006: checking to see if all hosts have failed and the running result is not ok 26264 1727204237.78006: done checking to see if all hosts have failed 26264 1727204237.78007: getting the remaining hosts for this loop 26264 1727204237.78009: done getting the remaining hosts for this loop 26264 1727204237.78013: getting the next task for host managed-node3 26264 1727204237.78019: done getting next task for host managed-node3 26264 1727204237.78021: ^ task is: TASK: meta (flush_handlers) 26264 1727204237.78023: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0,
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204237.78027: getting variables 26264 1727204237.78028: in VariableManager get_vars() 26264 1727204237.78051: Calling all_inventory to load vars for managed-node3 26264 1727204237.78054: Calling groups_inventory to load vars for managed-node3 26264 1727204237.78057: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204237.78069: Calling all_plugins_play to load vars for managed-node3 26264 1727204237.78072: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204237.78084: Calling groups_plugins_play to load vars for managed-node3 26264 1727204237.78284: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000007c 26264 1727204237.78288: WORKER PROCESS EXITING 26264 1727204237.78319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204237.78489: done with get_vars() 26264 1727204237.78498: done getting variables 26264 1727204237.78559: in VariableManager get_vars() 26264 1727204237.78569: Calling all_inventory to load vars for managed-node3 26264 1727204237.78571: Calling groups_inventory to load vars for managed-node3 26264 1727204237.78573: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204237.78576: Calling all_plugins_play to load vars for managed-node3 26264 1727204237.78577: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204237.78579: Calling groups_plugins_play to load vars for managed-node3 26264 1727204237.78676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204237.78785: done with get_vars() 26264 1727204237.78796: done queuing things up, now waiting for results queue to drain 26264 1727204237.78797: results 
queue empty 26264 1727204237.78798: checking for any_errors_fatal 26264 1727204237.78799: done checking for any_errors_fatal 26264 1727204237.78800: checking for max_fail_percentage 26264 1727204237.78804: done checking for max_fail_percentage 26264 1727204237.78805: checking to see if all hosts have failed and the running result is not ok 26264 1727204237.78805: done checking to see if all hosts have failed 26264 1727204237.78806: getting the remaining hosts for this loop 26264 1727204237.78806: done getting the remaining hosts for this loop 26264 1727204237.78808: getting the next task for host managed-node3 26264 1727204237.78811: done getting next task for host managed-node3 26264 1727204237.78813: ^ task is: TASK: Include the task 'el_repo_setup.yml' 26264 1727204237.78814: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204237.78815: getting variables 26264 1727204237.78816: in VariableManager get_vars() 26264 1727204237.78821: Calling all_inventory to load vars for managed-node3 26264 1727204237.78823: Calling groups_inventory to load vars for managed-node3 26264 1727204237.78824: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204237.78828: Calling all_plugins_play to load vars for managed-node3 26264 1727204237.78830: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204237.78831: Calling groups_plugins_play to load vars for managed-node3 26264 1727204237.78915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204237.79022: done with get_vars() 26264 1727204237.79028: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11
Tuesday 24 September 2024  14:57:17 -0400 (0:00:01.628)       0:00:01.640 *****

26264 1727204237.79087: entering _queue_task() for managed-node3/include_tasks 26264 1727204237.79089: Creating lock for include_tasks 26264 1727204237.79310: worker is 1 (out of 1 available) 26264 1727204237.79324: exiting _queue_task() for managed-node3/include_tasks 26264 1727204237.79337: done queuing things up, now waiting for results queue to drain 26264 1727204237.79339: waiting for pending results...
26264 1727204237.79491: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 26264 1727204237.79560: in run() - task 0affcd87-79f5-5ff5-08b0-000000000006 26264 1727204237.79572: variable 'ansible_search_path' from source: unknown 26264 1727204237.79600: calling self._execute() 26264 1727204237.79657: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204237.79661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204237.79671: variable 'omit' from source: magic vars 26264 1727204237.79750: _execute() done 26264 1727204237.79753: dumping result to json 26264 1727204237.79757: done dumping result, returning 26264 1727204237.79767: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-5ff5-08b0-000000000006] 26264 1727204237.79772: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000006 26264 1727204237.79860: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000006 26264 1727204237.79865: WORKER PROCESS EXITING 26264 1727204237.79928: no more pending results, returning what we have 26264 1727204237.79933: in VariableManager get_vars() 26264 1727204237.79965: Calling all_inventory to load vars for managed-node3 26264 1727204237.79969: Calling groups_inventory to load vars for managed-node3 26264 1727204237.79972: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204237.79987: Calling all_plugins_play to load vars for managed-node3 26264 1727204237.79989: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204237.79992: Calling groups_plugins_play to load vars for managed-node3 26264 1727204237.80148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204237.80351: done with get_vars() 26264 1727204237.80359: variable 'ansible_search_path' from source: unknown 26264 1727204237.80374: we have 
included files to process 26264 1727204237.80375: generating all_blocks data 26264 1727204237.80377: done generating all_blocks data 26264 1727204237.80377: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 26264 1727204237.80379: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 26264 1727204237.80381: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 26264 1727204237.81090: in VariableManager get_vars() 26264 1727204237.81114: done with get_vars() 26264 1727204237.81128: done processing included file 26264 1727204237.81130: iterating over new_blocks loaded from include file 26264 1727204237.81131: in VariableManager get_vars() 26264 1727204237.81143: done with get_vars() 26264 1727204237.81145: filtering new block on tags 26264 1727204237.81156: done filtering new block on tags 26264 1727204237.81160: in VariableManager get_vars() 26264 1727204237.81195: done with get_vars() 26264 1727204237.81197: filtering new block on tags 26264 1727204237.81215: done filtering new block on tags 26264 1727204237.81217: in VariableManager get_vars() 26264 1727204237.81224: done with get_vars() 26264 1727204237.81225: filtering new block on tags 26264 1727204237.81234: done filtering new block on tags 26264 1727204237.81235: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 26264 1727204237.81240: extending task lists for all hosts with included blocks 26264 1727204237.81275: done extending task lists 26264 1727204237.81276: done processing included files 26264 1727204237.81277: results queue empty 26264 1727204237.81277: checking for any_errors_fatal 26264 1727204237.81278: done checking for any_errors_fatal 26264 
1727204237.81279: checking for max_fail_percentage 26264 1727204237.81279: done checking for max_fail_percentage 26264 1727204237.81280: checking to see if all hosts have failed and the running result is not ok 26264 1727204237.81281: done checking to see if all hosts have failed 26264 1727204237.81281: getting the remaining hosts for this loop 26264 1727204237.81282: done getting the remaining hosts for this loop 26264 1727204237.81283: getting the next task for host managed-node3 26264 1727204237.81286: done getting next task for host managed-node3 26264 1727204237.81287: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 26264 1727204237.81289: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204237.81290: getting variables 26264 1727204237.81291: in VariableManager get_vars() 26264 1727204237.81296: Calling all_inventory to load vars for managed-node3 26264 1727204237.81298: Calling groups_inventory to load vars for managed-node3 26264 1727204237.81299: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204237.81304: Calling all_plugins_play to load vars for managed-node3 26264 1727204237.81306: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204237.81307: Calling groups_plugins_play to load vars for managed-node3 26264 1727204237.81399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204237.81512: done with get_vars() 26264 1727204237.81519: done getting variables

TASK [Gather the minimum subset of ansible_facts required by the network role test] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Tuesday 24 September 2024 14:57:17 -0400 (0:00:00.024) 0:00:01.664 *****

26264 1727204237.81573: entering _queue_task() for managed-node3/setup 26264 1727204237.81792: worker is 1 (out of 1 available) 26264 1727204237.81804: exiting _queue_task() for managed-node3/setup 26264 1727204237.81817: done queuing things up, now waiting for results queue to drain 26264 1727204237.81819: waiting for pending results... 
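[Editor's note: the repeated `Calling all_inventory ... groups_plugins_play to load vars` sequence above is VariableManager layering variable sources in precedence order, with later sources overriding earlier ones. A loose sketch of the idea, with hypothetical variable values; the real `get_vars()` also handles group depth, `hash_behaviour` merging, caching, and many more sources:]

```python
def layer_vars(*sources: dict) -> dict:
    """Merge variable sources in order; a later source wins on key conflicts."""
    merged = {}
    for src in sources:        # all_inventory -> groups_inventory -> ...
        merged.update(src)     # ... -> groups_plugins_play applied last
    return merged

# Hypothetical example values, purely illustrative:
all_inventory = {"ansible_timeout": 10, "nodename": "managed-node3"}
groups_plugins_play = {"ansible_timeout": 30}
hostvars = layer_vars(all_inventory, groups_plugins_play)
```

Here the play-level group vars override the inventory-wide timeout while leaving the other key intact.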
26264 1727204237.81973: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 26264 1727204237.82037: in run() - task 0affcd87-79f5-5ff5-08b0-00000000008d 26264 1727204237.82045: variable 'ansible_search_path' from source: unknown 26264 1727204237.82051: variable 'ansible_search_path' from source: unknown 26264 1727204237.82089: calling self._execute() 26264 1727204237.82139: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204237.82143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204237.82153: variable 'omit' from source: magic vars 26264 1727204237.82526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204237.84181: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204237.84224: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204237.84252: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204237.84296: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204237.84315: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204237.84383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204237.84402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204237.84420: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204237.84449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204237.84466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204237.84586: variable 'ansible_facts' from source: unknown 26264 1727204237.84623: variable 'network_test_required_facts' from source: task vars 26264 1727204237.84651: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 26264 1727204237.84657: variable 'omit' from source: magic vars 26264 1727204237.84689: variable 'omit' from source: magic vars 26264 1727204237.84711: variable 'omit' from source: magic vars 26264 1727204237.84733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204237.84755: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204237.84781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204237.84788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204237.84797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204237.84820: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204237.84823: variable 'ansible_host' from source: host vars for 
'managed-node3' 26264 1727204237.84825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204237.84895: Set connection var ansible_pipelining to False 26264 1727204237.84899: Set connection var ansible_connection to ssh 26264 1727204237.84902: Set connection var ansible_shell_type to sh 26264 1727204237.84905: Set connection var ansible_shell_executable to /bin/sh 26264 1727204237.84914: Set connection var ansible_timeout to 10 26264 1727204237.84919: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204237.84940: variable 'ansible_shell_executable' from source: unknown 26264 1727204237.84943: variable 'ansible_connection' from source: unknown 26264 1727204237.84946: variable 'ansible_module_compression' from source: unknown 26264 1727204237.84951: variable 'ansible_shell_type' from source: unknown 26264 1727204237.84954: variable 'ansible_shell_executable' from source: unknown 26264 1727204237.84956: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204237.84958: variable 'ansible_pipelining' from source: unknown 26264 1727204237.84960: variable 'ansible_timeout' from source: unknown 26264 1727204237.84962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204237.85066: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204237.85073: variable 'omit' from source: magic vars 26264 1727204237.85078: starting attempt loop 26264 1727204237.85080: running the handler 26264 1727204237.85092: _low_level_execute_command(): starting 26264 1727204237.85098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204237.85628: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204237.85650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204237.85668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204237.85686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204237.85726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204237.85740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204237.85807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204237.88068: stdout chunk (state=3): >>>/root <<< 26264 1727204237.88205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204237.88273: stderr chunk (state=3): >>><<< 26264 1727204237.88277: stdout chunk (state=3): >>><<< 26264 1727204237.88297: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204237.88307: _low_level_execute_command(): starting 26264 1727204237.88314: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854 `" && echo ansible-tmp-1727204237.8829725-26408-1537627181854="` echo /root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854 `" ) && sleep 0' 26264 1727204237.88795: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204237.88808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204237.88830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204237.88842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204237.88855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204237.88899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204237.88911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204237.88968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204237.91549: stdout chunk (state=3): >>>ansible-tmp-1727204237.8829725-26408-1537627181854=/root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854 <<< 26264 1727204237.91709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204237.91772: stderr chunk (state=3): >>><<< 26264 1727204237.91778: stdout chunk (state=3): >>><<< 26264 1727204237.91796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204237.8829725-26408-1537627181854=/root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204237.91839: variable 'ansible_module_compression' from source: unknown 26264 1727204237.91888: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204237.91936: variable 'ansible_facts' from source: unknown 26264 1727204237.92058: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854/AnsiballZ_setup.py 26264 1727204237.92178: Sending initial data 26264 1727204237.92188: Sent initial data (152 bytes) 26264 1727204237.92889: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204237.92902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204237.92922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204237.92934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204237.92954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204237.92993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204237.93005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204237.93072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204237.95495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204237.95537: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204237.95580: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpryemqhh5 /root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854/AnsiballZ_setup.py <<< 26264 1727204237.95620: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204237.97377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204237.97501: stderr chunk (state=3): >>><<< 26264 1727204237.97506: stdout chunk (state=3): >>><<< 26264 1727204237.97526: done 
transferring module to remote 26264 1727204237.97538: _low_level_execute_command(): starting 26264 1727204237.97543: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854/ /root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854/AnsiballZ_setup.py && sleep 0' 26264 1727204237.98024: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204237.98045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204237.98058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204237.98072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204237.98128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204237.98147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204237.98192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204238.00860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204238.00904: stderr chunk (state=3): >>><<< 26264 
1727204238.00907: stdout chunk (state=3): >>><<< 26264 1727204238.00921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204238.00928: _low_level_execute_command(): starting 26264 1727204238.00934: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854/AnsiballZ_setup.py && sleep 0' 26264 1727204238.01573: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204238.01592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.01612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.01631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 
1727204238.01676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.01693: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204238.01707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.01728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204238.01740: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204238.01752: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204238.01766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.01780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.01797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.01811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.01821: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204238.01839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.01921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204238.01948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204238.01969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204238.02174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204238.04816: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 26264 1727204238.04830: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 26264 1727204238.04900: stdout 
chunk (state=3): >>>import '_io' # import 'marshal' # <<< 26264 1727204238.04935: stdout chunk (state=3): >>>import 'posix' # <<< 26264 1727204238.04967: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 26264 1727204238.05086: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 26264 1727204238.05149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e98dc0> <<< 26264 1727204238.05190: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 26264 1727204238.05202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e98b20> <<< 26264 1727204238.05230: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 26264 1727204238.05416: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e98ac0> import '_signal' # <<< 26264 1727204238.05420: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e3d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e3d940> <<< 26264 1727204238.05426: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e3d670> <<< 26264 1727204238.05452: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 26264 1727204238.05484: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 26264 1727204238.05530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 26264 1727204238.05533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 26264 1727204238.05571: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bcf190> <<< 26264 1727204238.05602: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 26264 1727204238.05605: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 26264 
1727204238.05673: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bcf220> <<< 26264 1727204238.05699: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 26264 1727204238.05746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bcf940> <<< 26264 1727204238.05785: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e55880> <<< 26264 1727204238.05813: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 26264 1727204238.05817: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bc8d90> <<< 26264 1727204238.05874: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bf2d90> <<< 26264 1727204238.05931: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e3d970> <<< 26264 1727204238.05962: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type 
"help", "copyright", "credits" or "license" for more information. <<< 26264 1727204238.06284: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 26264 1727204238.06313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 26264 1727204238.06342: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 26264 1727204238.06367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 26264 1727204238.06376: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 26264 1727204238.06380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 26264 1727204238.06387: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 26264 1727204238.06410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 26264 1727204238.06413: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b6df10> <<< 26264 1727204238.06470: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b740a0> <<< 26264 1727204238.06479: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 26264 1727204238.06496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 26264 1727204238.06509: stdout chunk (state=3): >>>import '_sre' # <<< 26264 1727204238.06539: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 26264 
1727204238.06545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 26264 1727204238.06571: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 26264 1727204238.06600: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b675b0> <<< 26264 1727204238.06633: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b6e6a0> <<< 26264 1727204238.06638: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b6d3d0> <<< 26264 1727204238.06649: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 26264 1727204238.06718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 26264 1727204238.06737: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 26264 1727204238.06787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204238.06801: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 26264 1727204238.06803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 26264 1727204238.06840: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' 
import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1a55eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a559a0> <<< 26264 1727204238.06868: stdout chunk (state=3): >>>import 'itertools' # <<< 26264 1727204238.06883: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a55fa0> <<< 26264 1727204238.06916: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 26264 1727204238.06930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 26264 1727204238.06956: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a55df0> <<< 26264 1727204238.06989: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a65160> <<< 26264 1727204238.06995: stdout chunk (state=3): >>>import '_collections' # <<< 26264 1727204238.07042: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b49e20> import '_functools' # <<< 26264 1727204238.07075: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b41700> <<< 26264 1727204238.07155: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b55760> <<< 26264 1727204238.07164: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b75eb0> <<< 26264 1727204238.07170: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 26264 1727204238.07202: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1a65d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b49340> <<< 26264 1727204238.07256: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1b55370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b7ba60> <<< 26264 1727204238.07283: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 26264 1727204238.07312: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204238.07342: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches 
/usr/lib64/python3.9/warnings.py <<< 26264 1727204238.07350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 26264 1727204238.07366: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a65f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a65e80> <<< 26264 1727204238.07399: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a65df0> <<< 26264 1727204238.07417: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 26264 1727204238.07433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 26264 1727204238.07449: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 26264 1727204238.07463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 26264 1727204238.07482: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 26264 1727204238.07540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 26264 1727204238.07565: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe7c1a39460> <<< 26264 1727204238.07580: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 26264 1727204238.07605: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 26264 1727204238.07634: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a39550> <<< 26264 1727204238.07758: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a170d0> <<< 26264 1727204238.07792: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a68b20> <<< 26264 1727204238.07809: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a684c0> <<< 26264 1727204238.07826: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 26264 1727204238.07841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 26264 1727204238.07874: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 26264 1727204238.07895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 26264 1727204238.07902: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 26264 1727204238.07924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c196d2b0> <<< 26264 1727204238.07961: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe7c1a24d60> <<< 26264 1727204238.08014: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a68fa0> <<< 26264 1727204238.08023: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b7b0d0> <<< 26264 1727204238.08040: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 26264 1727204238.08070: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 26264 1727204238.08095: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c197dbe0> <<< 26264 1727204238.08112: stdout chunk (state=3): >>>import 'errno' # <<< 26264 1727204238.08151: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c197df10> <<< 26264 1727204238.08171: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 26264 1727204238.08177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 26264 1727204238.08199: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 26264 1727204238.08210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe7c1990820> <<< 26264 1727204238.08239: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 26264 1727204238.08272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 26264 1727204238.08298: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1990d60> <<< 26264 1727204238.08352: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1929490> <<< 26264 1727204238.08365: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c197df40> <<< 26264 1727204238.08379: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 26264 1727204238.08388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 26264 1727204238.08433: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1939370> <<< 26264 1727204238.08445: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c19906a0> <<< 26264 1727204238.08453: stdout chunk (state=3): >>>import 'pwd' # <<< 26264 1727204238.08477: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 26264 
1727204238.08480: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1939430> <<< 26264 1727204238.08521: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a65ac0> <<< 26264 1727204238.08539: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 26264 1727204238.08552: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 26264 1727204238.08583: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 26264 1727204238.08590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 26264 1727204238.08628: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1955790> <<< 26264 1727204238.08649: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 26264 1727204238.08685: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1955a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fe7c1955850> <<< 26264 1727204238.08716: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1955940> <<< 26264 1727204238.08747: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 26264 1727204238.08952: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1955d90> <<< 26264 1727204238.08986: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c195f2e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c19559d0> <<< 26264 1727204238.09015: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1949b20> <<< 26264 1727204238.09037: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a656a0> <<< 26264 1727204238.09059: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 26264 1727204238.09117: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 26264 1727204238.09157: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1955b80> <<< 26264 1727204238.09299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 26264 1727204238.09322: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe7c188a760> <<< 26264 1727204238.09590: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip' # zipimport: zlib available <<< 26264 1727204238.09684: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.09725: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 26264 1727204238.09744: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.09754: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 26264 1727204238.09770: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.11717: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.13003: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c78b0> <<< 26264 1727204238.13031: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204238.13053: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 26264 1727204238.13081: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 26264 1727204238.13107: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c17c7160> <<< 26264 1727204238.13156: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c7280> <<< 26264 1727204238.13190: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c75e0> <<< 26264 1727204238.13209: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 26264 1727204238.13266: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c74f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c7e20> <<< 26264 1727204238.13274: stdout chunk (state=3): >>>import 'atexit' # <<< 26264 1727204238.13304: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c17c7580> <<< 26264 1727204238.13332: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 26264 1727204238.13355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 26264 1727204238.13394: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c7100> <<< 26264 1727204238.13414: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 26264 1727204238.13426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 26264 1727204238.13455: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 26264 1727204238.13468: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 26264 1727204238.13494: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 26264 1727204238.13586: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c175c040> <<< 26264 1727204238.13620: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11893d0> <<< 26264 1727204238.13654: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11890d0> <<< 26264 1727204238.13670: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 26264 1727204238.13684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 26264 1727204238.13724: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1189d30> <<< 26264 1727204238.13738: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17afd90> <<< 26264 1727204238.13903: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17af3a0> <<< 26264 1727204238.13946: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17aff40> <<< 26264 1727204238.13989: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 26264 1727204238.14470: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c188aa90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1785dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1785490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c4a90> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c17855b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17855e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11f4f70> import 'datetime' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7c18102e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 26264 1727204238.14637: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11f17f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1810460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 26264 1727204238.14652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204238.14693: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 26264 1727204238.14724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 26264 1727204238.14829: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1810c40> <<< 26264 1727204238.15057: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c11f1790> <<< 26264 1727204238.15211: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204238.15231: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fe7c1810130> <<< 26264 1727204238.15307: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204238.15311: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1810670> <<< 26264 1727204238.15396: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204238.15413: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1810730> <<< 26264 1727204238.15426: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c18089a0> <<< 26264 1727204238.15479: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 26264 1727204238.15498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 26264 1727204238.15511: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 26264 1727204238.15542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 26264 1727204238.15642: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204238.15646: stdout chunk (state=3): >>># extension 
module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11e78e0> <<< 26264 1727204238.16001: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204238.16020: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c171fc70> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c11f0520> <<< 26264 1727204238.16089: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so'<<< 26264 1727204238.16120: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11e7e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c11f0940> <<< 26264 1727204238.16163: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 26264 1727204238.16180: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 26264 1727204238.16208: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.16332: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.16486: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.16502: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip 
/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 26264 1727204238.16543: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.16593: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 26264 1727204238.16607: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.16767: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.16932: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.17636: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.18122: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 26264 1727204238.18144: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204238.18205: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c171c790> <<< 26264 1727204238.18279: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 26264 1727204238.18296: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1759850> <<< 26264 1727204238.18304: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d91fa0> <<< 26264 1727204238.18363: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 26264 1727204238.18381: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.18385: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 26264 1727204238.18496: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.18625: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 26264 1727204238.18659: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c178d310> # zipimport: zlib available <<< 26264 1727204238.19049: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.19424: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.19470: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.19543: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # 
zipimport: zlib available <<< 26264 1727204238.19573: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.19609: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 26264 1727204238.19612: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.19671: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.19748: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 26264 1727204238.19797: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 26264 1727204238.19801: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.19812: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.19844: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 26264 1727204238.20034: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.20221: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 26264 1727204238.20250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 26264 1727204238.20332: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17cdca0> # zipimport: zlib available <<< 26264 1727204238.20393: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 26264 1727204238.20479: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 26264 1727204238.20499: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 26264 1727204238.20502: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.20526: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.20571: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 26264 1727204238.20576: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.20599: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.20638: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.20728: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.20801: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 26264 1727204238.20819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204238.20887: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c173dc70> <<< 26264 1727204238.20991: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17cdbb0> <<< 26264 1727204238.21034: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 26264 1727204238.21085: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.21153: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.21169: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.21204: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 26264 1727204238.21223: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 26264 1727204238.21242: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 26264 1727204238.21277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 26264 1727204238.21298: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 26264 1727204238.21312: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 26264 1727204238.21387: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c175ad60> <<< 26264 1727204238.21428: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c179bb80> <<< 26264 1727204238.21493: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0bec160> <<< 26264 1727204238.21515: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 26264 1727204238.21548: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py <<< 26264 1727204238.21551: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 26264 1727204238.21628: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 26264 1727204238.21649: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 26264 1727204238.21704: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.21767: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.21799: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.21802: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 26264 1727204238.21834: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.21866: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.21900: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.21938: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 26264 1727204238.21941: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.21999: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.22081: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.22085: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.22127: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 26264 1727204238.22269: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.22404: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.22440: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.22484: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204238.22534: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 26264 1727204238.22537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 26264 1727204238.22559: stdout chunk (state=3): 
>>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 26264 1727204238.22596: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0aee100> <<< 26264 1727204238.22599: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 26264 1727204238.22620: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 26264 1727204238.22660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 26264 1727204238.22694: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 26264 1727204238.22697: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d54a60> <<< 26264 1727204238.22733: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0d549d0> <<< 26264 1727204238.22800: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d26c70> <<< 26264 1727204238.22819: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d26c10> <<< 26264 1727204238.22846: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d70460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d703d0> <<< 26264 1727204238.22883: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 26264 1727204238.22911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 26264 1727204238.22944: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0d36310> <<< 26264 1727204238.22956: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d369a0> <<< 26264 1727204238.22974: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 26264 1727204238.23000: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d36940> <<< 26264 1727204238.23032: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 26264 1727204238.23042: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 26264 1727204238.23067: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0b500d0> <<< 26264 1727204238.23115: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1819c40> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d70790> <<< 26264 1727204238.23160: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 26264 1727204238.23177: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 26264 1727204238.23180: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23222: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23281: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 26264 1727204238.23284: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23319: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 26264 1727204238.23390: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 26264 1727204238.23414: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 26264 1727204238.23418: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23444: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23456: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 26264 1727204238.23499: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23553: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 26264 1727204238.23556: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23584: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23622: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 26264 1727204238.23687: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23731: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23778: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.23834: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py <<< 26264 1727204238.23837: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 26264 1727204238.24237: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.24589: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 26264 1727204238.24636: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.24696: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.24710: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.24763: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 26264 1727204238.24768: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.24780: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.24810: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 26264 1727204238.24861: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.24915: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns 
# loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 26264 1727204238.24928: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.24941: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.24968: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 26264 1727204238.24997: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.25035: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 26264 1727204238.25094: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.25177: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 26264 1727204238.25201: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0a40f10> <<< 26264 1727204238.25223: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 26264 1727204238.25250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 26264 1727204238.25412: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0a409d0> <<< 26264 1727204238.25416: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py 
# zipimport: zlib available <<< 26264 1727204238.25473: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.25528: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 26264 1727204238.25602: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.25685: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 26264 1727204238.25746: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.25822: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 26264 1727204238.25829: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.25851: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.25894: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 26264 1727204238.25922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 26264 1727204238.26072: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0a62c10> <<< 26264 1727204238.26320: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0ab2c40> import 
ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 26264 1727204238.26375: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.26430: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 26264 1727204238.26433: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.26487: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.26563: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.26658: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.26800: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 26264 1727204238.26804: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.26832: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.26887: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available <<< 26264 1727204238.26911: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.26960: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 26264 
1727204238.27030: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0ab45e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0ab4790> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 26264 1727204238.27055: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 26264 1727204238.27087: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.27129: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 26264 1727204238.27258: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.27389: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 26264 1727204238.27473: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.27555: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.27585: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.27634: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip 
/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 26264 1727204238.27637: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 26264 1727204238.27744: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.27748: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.27861: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.27986: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 26264 1727204238.28088: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.28199: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 26264 1727204238.28203: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.28225: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.28258: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.28693: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.29111: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip 
/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 26264 1727204238.29127: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.29196: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.29288: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 26264 1727204238.29376: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.29462: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 26264 1727204238.29584: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.29725: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 26264 1727204238.29774: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 26264 1727204238.29779: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.29790: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.29826: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available <<< 26264 1727204238.29910: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.29991: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30160: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30340: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 26264 1727204238.30344: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30369: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30415: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 26264 1727204238.30419: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30456: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30460: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 26264 1727204238.30518: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30590: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 26264 1727204238.30594: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30619: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30632: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip 
/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 26264 1727204238.30683: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30736: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 26264 1727204238.30739: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30778: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.30830: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 26264 1727204238.31046: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31269: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 26264 1727204238.31272: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31313: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31367: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 26264 1727204238.31404: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31447: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 26264 1727204238.31461: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 
1727204238.31474: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31501: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 26264 1727204238.31535: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31580: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 26264 1727204238.31583: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31637: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31726: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 26264 1727204238.31743: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31758: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 26264 1727204238.31783: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31827: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 26264 1727204238.31859: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31877: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.31911: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 
1727204238.31951: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.32007: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.32085: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 26264 1727204238.32101: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.32132: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.32179: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 26264 1727204238.32356: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.32526: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 26264 1727204238.32535: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.32559: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.32603: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 26264 1727204238.32651: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 
1727204238.32704: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 26264 1727204238.32768: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.32848: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 26264 1727204238.32921: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.33000: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 26264 1727204238.33079: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204238.33252: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 26264 1727204238.33296: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 26264 1727204238.33300: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 26264 1727204238.33337: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0854d90> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0826d00> <<< 26264 1727204238.33393: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0826a60> <<< 26264 1727204238.34420: stdout chunk (state=3): >>>import 'gc' # <<< 26264 1727204238.34857: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": 
"RedHat", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "18", "epoch": "1727204238", "epoch_int": "1727204238", "date": "2024-09-24", "time": "14:57:18", "iso8601_micro": "2024-09-24T18:57:18.342453Z", "iso8601": "2024-09-24T18:57:18Z", "iso8601_basic": "20240924T145718342453", "iso8601_basic_short": "20240924T145718", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-5<<< 26264 1727204238.34881: stdout chunk (state=3): >>>11.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": 
"user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204238.35486: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external <<< 26264 1727204238.35559: stdout chunk (state=3): >>># cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing 
_abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # 
cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # 
destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps <<< 
26264 1727204238.35598: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts <<< 26264 1727204238.35645: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb <<< 26264 1727204238.35648: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc <<< 26264 1727204238.35910: stdout 
chunk (state=3): >>># destroy _sitebuiltins <<< 26264 1727204238.35936: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 26264 1727204238.35970: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 26264 1727204238.36023: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 26264 1727204238.36037: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 26264 1727204238.36067: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 26264 1727204238.36109: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 26264 1727204238.36195: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 26264 1727204238.36213: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 26264 1727204238.36271: stdout chunk (state=3): >>># destroy shlex # destroy datetime # destroy base64 <<< 26264 1727204238.36303: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 26264 1727204238.36319: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 26264 1727204238.36442: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] 
wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon <<< 26264 1727204238.36508: stdout chunk (state=3): >>># cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 26264 1727204238.36567: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # 
destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator <<< 26264 1727204238.36631: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 26264 1727204238.36680: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 26264 1727204238.36716: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket <<< 26264 1727204238.36719: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 26264 1727204238.36895: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 26264 1727204238.36926: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat <<< 26264 1727204238.36973: stdout chunk (state=3): 
>>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 26264 1727204238.36989: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 26264 1727204238.37028: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 26264 1727204238.37332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204238.37427: stderr chunk (state=3): >>><<< 26264 1727204238.37430: stdout chunk (state=3): >>><<< 26264 1727204238.37686: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e98dc0> # 
/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e98b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e98ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e3d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e3d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e3d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from 
'/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bcf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bcf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e55880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1bf2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1e3d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b6df10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b740a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b6e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b6d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1a55eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a559a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a55fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a55df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a65160> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b49e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b41700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b55760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b75eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1a65d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b49340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1b55370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b7ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a65f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a65e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a65df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a39460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a39550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a170d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a68b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a684c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c196d2b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a24d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a68fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1b7b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c197dbe0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c197df10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1990820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1990d60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1929490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c197df40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1939370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c19906a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1939430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a65ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1955790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1955a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1955850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1955940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1955d90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c195f2e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c19559d0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1949b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1a656a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1955b80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe7c188a760> # zipimport: found 103 names in '/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c78b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' 
loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c17c7160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c7280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c75e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c74f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c7e20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c17c7580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c7100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7c175c040> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11893d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11890d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1189d30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17afd90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17af3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17aff40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c188aa90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1785dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1785490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17c4a90> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c17855b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17855e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11f4f70> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c18102e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11f17f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1810460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1810c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c11f1790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1810130> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1810670> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c1810730> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c18089a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11e78e0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c171fc70> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c11f0520> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c11e7e80> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7c11f0940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c171c790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1759850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d91fa0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c178d310> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17cdca0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c173dc70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c17cdbb0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c175ad60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c179bb80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0bec160> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded 
from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0aee100> # 
/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d54a60> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0d549d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d26c70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d26c10> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d70460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d703d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0d36310> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d369a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d36940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0b500d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c1819c40> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0d70790> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0a40f10> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0a409d0> import ansible.module_utils.facts.system.local # loaded from Zip 
/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0a62c10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0ab2c40> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip 
/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0ab45e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0ab4790> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip 
/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip 
/tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_gzu_d1wf/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7c0854d90> import 'stringprep' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0826d00> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7c0826a60> import 'gc' # {"ansible_facts": {"ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "18", "epoch": "1727204238", "epoch_int": "1727204238", "date": "2024-09-24", "time": "14:57:18", "iso8601_micro": "2024-09-24T18:57:18.342453Z", "iso8601": "2024-09-24T18:57:18Z", 
"iso8601_basic": "20240924T145718342453", "iso8601_basic_short": "20240924T145718", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot 
$@\n}"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] 
removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy 
ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy 
ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # 
cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # 
destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # 
destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy 
select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl 
# cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] 
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy 
ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # 
cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 26264 1727204238.39068: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204238.39072: _low_level_execute_command(): starting 26264 1727204238.39074: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204237.8829725-26408-1537627181854/ > /dev/null 2>&1 && sleep 0' 26264 1727204238.39524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204238.39533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.39544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.39558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.39606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 
1727204238.39613: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204238.39621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.39635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204238.39643: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204238.39652: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204238.39657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.39668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.39680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.39688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.39696: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204238.39711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.39781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204238.39796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204238.39800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204238.40039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204238.41861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204238.41868: stdout chunk (state=3): >>><<< 26264 1727204238.41874: stderr chunk (state=3): >>><<< 26264 1727204238.41896: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204238.41899: handler run complete 26264 1727204238.41946: variable 'ansible_facts' from source: unknown 26264 1727204238.42000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204238.42115: variable 'ansible_facts' from source: unknown 26264 1727204238.42183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204238.42236: attempt loop complete, returning result 26264 1727204238.42240: _execute() done 26264 1727204238.42242: dumping result to json 26264 1727204238.42254: done dumping result, returning 26264 1727204238.42262: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-5ff5-08b0-00000000008d] 26264 1727204238.42269: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000008d 26264 1727204238.42474: done 
sending task result for task 0affcd87-79f5-5ff5-08b0-00000000008d 26264 1727204238.42477: WORKER PROCESS EXITING ok: [managed-node3] 26264 1727204238.42595: no more pending results, returning what we have 26264 1727204238.42599: results queue empty 26264 1727204238.42600: checking for any_errors_fatal 26264 1727204238.42601: done checking for any_errors_fatal 26264 1727204238.42602: checking for max_fail_percentage 26264 1727204238.42604: done checking for max_fail_percentage 26264 1727204238.42605: checking to see if all hosts have failed and the running result is not ok 26264 1727204238.42606: done checking to see if all hosts have failed 26264 1727204238.42607: getting the remaining hosts for this loop 26264 1727204238.42608: done getting the remaining hosts for this loop 26264 1727204238.42612: getting the next task for host managed-node3 26264 1727204238.42620: done getting next task for host managed-node3 26264 1727204238.42622: ^ task is: TASK: Check if system is ostree 26264 1727204238.42625: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204238.42628: getting variables 26264 1727204238.42630: in VariableManager get_vars() 26264 1727204238.42655: Calling all_inventory to load vars for managed-node3 26264 1727204238.42658: Calling groups_inventory to load vars for managed-node3 26264 1727204238.42661: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204238.42678: Calling all_plugins_play to load vars for managed-node3 26264 1727204238.42681: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204238.42684: Calling groups_plugins_play to load vars for managed-node3 26264 1727204238.42852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204238.43062: done with get_vars() 26264 1727204238.43074: done getting variables
TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Tuesday 24 September 2024  14:57:18 -0400 (0:00:00.617)       0:00:02.282 *****
26264 1727204238.43343: entering _queue_task() for managed-node3/stat 26264 1727204238.43742: worker is 1 (out of 1 available) 26264 1727204238.43873: exiting _queue_task() for managed-node3/stat 26264 1727204238.43886: done queuing things up, now waiting for results queue to drain 26264 1727204238.43887: waiting for pending results... 
26264 1727204238.44450: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 26264 1727204238.44839: in run() - task 0affcd87-79f5-5ff5-08b0-00000000008f 26264 1727204238.44853: variable 'ansible_search_path' from source: unknown 26264 1727204238.44858: variable 'ansible_search_path' from source: unknown 26264 1727204238.44992: calling self._execute() 26264 1727204238.45061: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204238.45066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204238.45078: variable 'omit' from source: magic vars 26264 1727204238.46645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204238.46944: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204238.46998: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204238.47045: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204238.47087: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204238.47183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204238.47211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204238.47253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204238.47286: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204238.47414: Evaluated conditional (not __network_is_ostree is defined): True 26264 1727204238.47425: variable 'omit' from source: magic vars 26264 1727204238.47482: variable 'omit' from source: magic vars 26264 1727204238.47520: variable 'omit' from source: magic vars 26264 1727204238.47546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204238.47607: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204238.47630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204238.47654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204238.47672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204238.47710: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204238.47718: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204238.47724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204238.47833: Set connection var ansible_pipelining to False 26264 1727204238.47841: Set connection var ansible_connection to ssh 26264 1727204238.47850: Set connection var ansible_shell_type to sh 26264 1727204238.47861: Set connection var ansible_shell_executable to /bin/sh 26264 1727204238.47874: Set connection var ansible_timeout to 10 26264 1727204238.47885: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204238.47921: variable 'ansible_shell_executable' from source: unknown 26264 1727204238.47928: variable 'ansible_connection' from 
source: unknown 26264 1727204238.47934: variable 'ansible_module_compression' from source: unknown 26264 1727204238.47940: variable 'ansible_shell_type' from source: unknown 26264 1727204238.47945: variable 'ansible_shell_executable' from source: unknown 26264 1727204238.47954: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204238.47961: variable 'ansible_pipelining' from source: unknown 26264 1727204238.47970: variable 'ansible_timeout' from source: unknown 26264 1727204238.47977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204238.48240: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204238.48258: variable 'omit' from source: magic vars 26264 1727204238.48270: starting attempt loop 26264 1727204238.48276: running the handler 26264 1727204238.48292: _low_level_execute_command(): starting 26264 1727204238.48304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204238.50038: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204238.50056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.50073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.50100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.50140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.50155: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204238.50171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 
1727204238.50195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204238.50209: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204238.50220: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204238.50231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.50243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.50263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.50279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.50290: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204238.50311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.50392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204238.50421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204238.50438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204238.50515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204238.52184: stdout chunk (state=3): >>>/root <<< 26264 1727204238.52384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204238.52387: stdout chunk (state=3): >>><<< 26264 1727204238.52392: stderr chunk (state=3): >>><<< 26264 1727204238.52523: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204238.52535: _low_level_execute_command(): starting 26264 1727204238.52539: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997 `" && echo ansible-tmp-1727204238.5242517-26432-195929088802997="` echo /root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997 `" ) && sleep 0' 26264 1727204238.53695: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204238.53708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.53729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.53746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.53792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.53804: stderr chunk (state=3): >>>debug2: match not found <<< 26264 
1727204238.53819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.53845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204238.53862: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204238.53876: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204238.53888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.53900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.53926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.53939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.53959: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204238.53980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.54075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204238.54097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204238.54114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204238.54204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204238.56656: stdout chunk (state=3): >>>ansible-tmp-1727204238.5242517-26432-195929088802997=/root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997 <<< 26264 1727204238.56972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204238.57285: stderr chunk (state=3): >>><<< 26264 1727204238.57289: stdout chunk (state=3): >>><<< 26264 1727204238.57489: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204238.5242517-26432-195929088802997=/root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204238.57492: variable 'ansible_module_compression' from source: unknown 26264 1727204238.57494: ANSIBALLZ: Using lock for stat 26264 1727204238.57496: ANSIBALLZ: Acquiring lock 26264 1727204238.57498: ANSIBALLZ: Lock acquired: 139841027930592 26264 1727204238.57500: ANSIBALLZ: Creating module 26264 1727204238.80661: ANSIBALLZ: Writing module into payload 26264 1727204238.80782: ANSIBALLZ: Writing module 26264 1727204238.80816: ANSIBALLZ: Renaming module 26264 1727204238.80852: ANSIBALLZ: Done creating module 26264 1727204238.80900: variable 'ansible_facts' from source: unknown 26264 1727204238.81012: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997/AnsiballZ_stat.py 26264 1727204238.81174: Sending initial data 26264 1727204238.81177: Sent initial data (153 bytes) 26264 1727204238.82167: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204238.82186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.82201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.82224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.82267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.82280: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204238.82297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.82318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204238.82330: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204238.82340: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204238.82351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.82366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.82381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.82393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.82408: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204238.82421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.82506: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204238.82530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204238.82551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204238.82644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204238.85222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204238.85318: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204238.85333: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpi3027r0w /root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997/AnsiballZ_stat.py <<< 26264 1727204238.85374: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204238.86838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204238.86951: stderr chunk (state=3): >>><<< 26264 1727204238.86954: stdout chunk (state=3): >>><<< 26264 1727204238.86956: done transferring module to remote 26264 1727204238.86958: _low_level_execute_command(): starting 26264 1727204238.86960: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997/ /root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997/AnsiballZ_stat.py && sleep 0' 26264 1727204238.87488: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204238.87982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.87998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.88017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.88058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.88075: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204238.88089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.88106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204238.88117: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204238.88129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204238.88140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.88153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.88171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.88183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.88193: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204238.88257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.88329: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204238.88441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204238.88457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204238.88579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204238.91083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204238.91168: stderr chunk (state=3): >>><<< 26264 1727204238.91172: stdout chunk (state=3): >>><<< 26264 1727204238.91266: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204238.91270: _low_level_execute_command(): starting 26264 1727204238.91273: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997/AnsiballZ_stat.py && sleep 0' 26264 1727204238.91840: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204238.91855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.91874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.91893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.91936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.91948: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204238.91963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.91984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204238.91997: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204238.92008: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204238.92022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204238.92037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204238.92054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204238.92069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204238.92081: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204238.92095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204238.92176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 
1727204238.92198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204238.92215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204238.92301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204238.95182: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 26264 1727204238.95262: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 26264 1727204238.95314: stdout chunk (state=3): >>>import 'posix' # <<< 26264 1727204238.95376: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 26264 1727204238.95440: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 26264 1727204238.95443: stdout chunk (state=3): >>># installed zipimport hook <<< 26264 1727204238.95506: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204238.95533: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 26264 1727204238.95574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 26264 1727204238.95602: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde98dc0> <<< 26264 1727204238.95662: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 26264 1727204238.95683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3abde3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde98b20> <<< 26264 1727204238.95722: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 26264 1727204238.95725: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 26264 1727204238.95736: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde98ac0> <<< 26264 1727204238.95773: stdout chunk (state=3): >>>import '_signal' # <<< 26264 1727204238.95799: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde3d490> <<< 26264 1727204238.95851: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 26264 1727204238.95878: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 26264 1727204238.95898: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde3d940> <<< 26264 1727204238.95922: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde3d670> <<< 26264 1727204238.95963: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 26264 1727204238.95984: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 26264 1727204238.95995: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 26264 1727204238.96034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 26264 1727204238.96046: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 26264 1727204238.96073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 26264 1727204238.96102: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbcf190> <<< 26264 1727204238.96139: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 26264 1727204238.96163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 26264 1727204238.96273: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbcf220> <<< 26264 1727204238.96306: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 26264 1727204238.96324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 26264 1727204238.96375: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbcf940> <<< 26264 1727204238.96400: stdout chunk 
(state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde55880> <<< 26264 1727204238.96443: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 26264 1727204238.96447: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbc8d90> <<< 26264 1727204238.96525: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 26264 1727204238.96527: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbf2d90> <<< 26264 1727204238.96599: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde3d970> <<< 26264 1727204238.96639: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 26264 1727204238.96953: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 26264 1727204238.96987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 26264 1727204238.97011: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 26264 1727204238.97043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 26264 1727204238.97046: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 26264 1727204238.97094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 26264 1727204238.97104: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 26264 1727204238.97128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb6df10> <<< 26264 1727204238.97188: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb740a0> <<< 26264 1727204238.97200: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 26264 1727204238.97234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 26264 1727204238.97270: stdout chunk (state=3): >>>import '_sre' # <<< 26264 1727204238.97282: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 26264 1727204238.97304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 
26264 1727204238.97316: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 26264 1727204238.97368: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb675b0> <<< 26264 1727204238.97387: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb6e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb6d3d0> <<< 26264 1727204238.97397: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 26264 1727204238.97492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 26264 1727204238.97517: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 26264 1727204238.97580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204238.97591: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 26264 1727204238.97629: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abdaf1eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdaf19a0> <<< 26264 1727204238.97654: stdout chunk (state=3): >>>import 
'itertools' # <<< 26264 1727204238.97688: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdaf1fa0> <<< 26264 1727204238.97714: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 26264 1727204238.97731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 26264 1727204238.97762: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdaf1df0> <<< 26264 1727204238.97803: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb01160> <<< 26264 1727204238.97821: stdout chunk (state=3): >>>import '_collections' # <<< 26264 1727204238.97891: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb49e20> import '_functools' # <<< 26264 1727204238.97921: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb41700> <<< 26264 1727204238.97988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 26264 1727204238.98017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb55760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb75eb0> <<< 26264 1727204238.98033: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 26264 1727204238.98067: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abdb01d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb49340> <<< 26264 1727204238.98111: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abdb55370> <<< 26264 1727204238.98138: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb7ba60> <<< 26264 1727204238.98159: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 26264 1727204238.98184: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204238.98209: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 26264 1727204238.98250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 26264 1727204238.98262: 
stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb01f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb01e80> <<< 26264 1727204238.98291: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb01df0> <<< 26264 1727204238.98318: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 26264 1727204238.98328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 26264 1727204238.98367: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 26264 1727204238.98393: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 26264 1727204238.98477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 26264 1727204238.98502: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdad5460> <<< 26264 1727204238.98535: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 26264 1727204238.98554: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 26264 1727204238.98595: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdad5550> <<< 26264 1727204238.98776: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdab30d0> <<< 26264 1727204238.98832: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb04b20> <<< 26264 1727204238.98843: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb044c0> <<< 26264 1727204238.98874: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 26264 1727204238.98884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 26264 1727204238.98934: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 26264 1727204238.98945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 26264 1727204238.98982: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 26264 1727204238.98993: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd9fe2b0> <<< 26264 1727204238.99037: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdac0d60> <<< 26264 1727204238.99106: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb04fa0> <<< 26264 1727204238.99125: stdout chunk (state=3): >>>import 'runpy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb7b0d0> <<< 26264 1727204238.99147: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 26264 1727204238.99173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 26264 1727204238.99201: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abda0ebe0> <<< 26264 1727204238.99239: stdout chunk (state=3): >>>import 'errno' # <<< 26264 1727204238.99272: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abda0ef10> <<< 26264 1727204238.99300: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 26264 1727204238.99351: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 26264 1727204238.99373: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abda21820> <<< 26264 1727204238.99385: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 26264 1727204238.99420: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 26264 1727204238.99460: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abda21d60> <<< 26264 1727204238.99502: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9af490> <<< 26264 1727204238.99512: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abda0ef40> <<< 26264 1727204238.99540: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 26264 1727204238.99550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 26264 1727204238.99607: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9bf370> <<< 26264 1727204238.99627: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abda216a0> <<< 26264 1727204238.99637: stdout chunk (state=3): >>>import 'pwd' # <<< 26264 1727204238.99668: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9bf430> <<< 26264 1727204238.99726: stdout chunk (state=3): >>>import 
'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb01ac0> <<< 26264 1727204238.99759: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 26264 1727204238.99777: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 26264 1727204238.99810: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 26264 1727204238.99824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 26264 1727204238.99859: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9db790> <<< 26264 1727204238.99891: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 26264 1727204238.99932: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9dba60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd9db850> <<< 26264 1727204238.99962: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' 
import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9db940> <<< 26264 1727204239.00008: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 26264 1727204239.00320: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9dbd90> <<< 26264 1727204239.00368: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9e52e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd9db9d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd9cfb20> <<< 26264 1727204239.00407: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb016a0> <<< 26264 1727204239.00410: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 26264 1727204239.00483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 26264 1727204239.00527: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd9dbb80> <<< 26264 1727204239.00701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f3abd3d2760> <<< 26264 1727204239.00909: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip' # zipimport: zlib available <<< 26264 1727204239.01054: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.01108: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 26264 1727204239.01156: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 26264 1727204239.01171: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.03233: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.05198: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f98b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204239.05256: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 26264 1727204239.05281: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 26264 1727204239.05321: stdout chunk (state=3): >>># extension 
module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.05324: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2f9160> <<< 26264 1727204239.05380: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f9280> <<< 26264 1727204239.05422: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f95e0> <<< 26264 1727204239.05471: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 26264 1727204239.05485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 26264 1727204239.05540: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f94f0> <<< 26264 1727204239.05596: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f9e20> <<< 26264 1727204239.05602: stdout chunk (state=3): >>>import 'atexit' # <<< 26264 1727204239.05677: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so'<<< 26264 1727204239.05692: stdout chunk (state=3): >>> import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2f9580> <<< 26264 1727204239.05735: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 26264 1727204239.05791: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 26264 1727204239.05873: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f9100> <<< 26264 1727204239.05914: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py<<< 26264 1727204239.05926: stdout chunk (state=3): >>> <<< 26264 1727204239.05958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 26264 1727204239.05992: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 26264 1727204239.06042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 26264 1727204239.06087: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 26264 1727204239.06205: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd24ffd0> <<< 26264 1727204239.06268: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.06280: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd26ec40> <<< 26264 1727204239.06333: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd26ef40> <<< 26264 1727204239.06361: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 26264 1727204239.06408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 26264 1727204239.06467: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd26e2e0> <<< 26264 1727204239.06496: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd361d90> <<< 26264 1727204239.06776: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd3613a0> <<< 26264 1727204239.06798: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 26264 1727204239.06836: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd361f40> <<< 26264 1727204239.06881: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 26264 1727204239.06893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 26264 1727204239.06931: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 26264 1727204239.06943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 26264 1727204239.06972: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 26264 1727204239.06996: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 26264 1727204239.07054: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd3d2a90> <<< 26264 1727204239.07212: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2ccdc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2cc490> <<< 26264 1727204239.07224: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd303580> <<< 26264 1727204239.07267: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.07280: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2cc5b0> <<< 26264 1727204239.07337: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2cc5e0> <<< 26264 1727204239.07403: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 26264 1727204239.07416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 26264 1727204239.07459: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 26264 1727204239.07509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 26264 1727204239.07621: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.07671: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd241f70> <<< 26264 1727204239.07698: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd3412e0> <<< 26264 1727204239.07711: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 26264 1727204239.07745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 26264 1727204239.07834: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.07865: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd23e7f0> <<< 26264 1727204239.07901: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd341460> <<< 26264 1727204239.07934: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 26264 1727204239.08016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 26264 
1727204239.08067: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 26264 1727204239.08093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 26264 1727204239.08113: stdout chunk (state=3): >>>import '_string' # <<< 26264 1727204239.08474: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd359f40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd23e790> <<< 26264 1727204239.08569: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.08585: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd23e5e0> <<< 26264 1727204239.08635: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.08661: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd23d550> <<< 26264 1727204239.08741: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.08770: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f3abd23d490> <<< 26264 1727204239.08781: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd3389a0> <<< 26264 1727204239.08825: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 26264 1727204239.08858: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 26264 1727204239.08891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 26264 1727204239.08976: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.09002: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2c26a0> <<< 26264 1727204239.09318: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.09366: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2c0bb0> <<< 26264 1727204239.09394: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2d10d0> <<< 26264 1727204239.09466: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 26264 1727204239.09498: stdout chunk (state=3): >>># 
extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2c2100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd305c40> <<< 26264 1727204239.09528: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.09558: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.09591: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 26264 1727204239.09620: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.09752: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.09875: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.09909: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.09940: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 26264 1727204239.09969: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.09994: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.10024: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 26264 1727204239.10058: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.10216: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.10415: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.11212: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.12088: stdout 
chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 26264 1727204239.12091: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 26264 1727204239.12093: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 26264 1727204239.12095: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 26264 1727204239.12125: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 26264 1727204239.12154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204239.12256: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'<<< 26264 1727204239.12268: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd20e940> <<< 26264 1727204239.12384: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 26264 1727204239.12392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 26264 1727204239.12424: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2bfd30> <<< 26264 1727204239.12452: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2b67c0><<< 26264 1727204239.12456: stdout chunk 
(state=3): >>> <<< 26264 1727204239.12524: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 26264 1727204239.12552: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.12556: stdout chunk (state=3): >>> <<< 26264 1727204239.12585: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.12614: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 26264 1727204239.12642: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.12850: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.13068: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py<<< 26264 1727204239.13072: stdout chunk (state=3): >>> <<< 26264 1727204239.13075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc'<<< 26264 1727204239.13077: stdout chunk (state=3): >>> <<< 26264 1727204239.13119: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2c04c0><<< 26264 1727204239.13125: stdout chunk (state=3): >>> <<< 26264 1727204239.13150: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.13153: stdout chunk (state=3): >>> <<< 26264 1727204239.13894: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.14542: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.14544: stdout chunk (state=3): >>> <<< 26264 1727204239.14621: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.14743: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 26264 1727204239.14767: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.14807: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.14881: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 26264 1727204239.14900: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.14982: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.15121: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 26264 1727204239.15150: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.15209: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py<<< 26264 1727204239.15213: stdout chunk (state=3): >>> <<< 26264 1727204239.15233: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.15279: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.15366: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 26264 1727204239.15380: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.15782: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.16036: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 26264 1727204239.16087: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 26264 1727204239.16120: stdout chunk (state=3): >>>import '_ast' # <<< 26264 1727204239.16245: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abcda5940> <<< 26264 1727204239.16260: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.16369: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.16372: stdout chunk (state=3): >>> <<< 26264 1727204239.16472: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py <<< 26264 1727204239.16500: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 26264 1727204239.16530: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 26264 1727204239.16574: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.16635: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.16706: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/locale.py<<< 26264 1727204239.16721: stdout chunk (state=3): >>> <<< 26264 1727204239.16732: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.16799: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.16865: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 26264 1727204239.17020: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.17023: stdout chunk (state=3): >>> <<< 26264 1727204239.17123: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py<<< 26264 1727204239.17126: stdout chunk (state=3): >>> <<< 26264 1727204239.17174: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 26264 1727204239.17299: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so'<<< 26264 1727204239.17331: stdout chunk (state=3): >>> # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd34cb50><<< 26264 1727204239.17336: stdout chunk (state=3): >>> <<< 26264 1727204239.17385: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abcdadfa0> <<< 26264 1727204239.17464: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/file.py <<< 26264 1727204239.17490: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 26264 1727204239.17503: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.17741: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.17744: stdout chunk (state=3): >>> <<< 26264 1727204239.17832: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.17890: 
stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.17895: stdout chunk (state=3): >>> <<< 26264 1727204239.17962: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 26264 1727204239.18011: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 26264 1727204239.18046: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 26264 1727204239.18128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc'<<< 26264 1727204239.18131: stdout chunk (state=3): >>> <<< 26264 1727204239.18171: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 26264 1727204239.18217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 26264 1727204239.18395: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abcdf56d0><<< 26264 1727204239.18407: stdout chunk (state=3): >>> <<< 26264 1727204239.18555: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd200c10> <<< 26264 1727204239.18604: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd1fe5b0><<< 26264 1727204239.18630: stdout chunk (state=3): >>> # destroy ansible.module_utils.distro<<< 26264 1727204239.18659: stdout chunk (state=3): >>> import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 26264 1727204239.18691: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.18701: stdout chunk (state=3): >>> <<< 26264 
1727204239.18743: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.18788: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py <<< 26264 1727204239.18806: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py<<< 26264 1727204239.18818: stdout chunk (state=3): >>> <<< 26264 1727204239.18928: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/basic.py<<< 26264 1727204239.18930: stdout chunk (state=3): >>> <<< 26264 1727204239.18958: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.18973: stdout chunk (state=3): >>> <<< 26264 1727204239.19001: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.19016: stdout chunk (state=3): >>> import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 26264 1727204239.19045: stdout chunk (state=3): >>># zipimport: zlib available<<< 26264 1727204239.19056: stdout chunk (state=3): >>> <<< 26264 1727204239.19266: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.19587: stdout chunk (state=3): >>># zipimport: zlib available <<< 26264 1727204239.19810: stdout chunk (state=3): >>> <<< 26264 1727204239.19824: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 26264 1727204239.19860: stdout chunk (state=3): >>># destroy __main__ <<< 26264 1727204239.20384: stdout chunk (state=3): >>># clear builtins._ # 
clear sys.path <<< 26264 1727204239.20425: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 # clear sys.ps2<<< 26264 1727204239.20513: stdout chunk (state=3): >>> # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks <<< 26264 1727204239.20654: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread<<< 26264 1727204239.20797: stdout chunk (state=3): >>> # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings <<< 26264 1727204239.21014: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc <<< 26264 1727204239.21103: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants<<< 26264 1727204239.21328: stdout chunk (state=3): >>> # cleanup[2] removing 
sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib <<< 26264 1727204239.21417: stdout chunk (state=3): >>># cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib<<< 26264 1727204239.21458: stdout chunk (state=3): >>> # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil <<< 26264 1727204239.21488: stdout chunk (state=3): >>># destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2<<< 26264 1727204239.21491: stdout chunk (state=3): >>> # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib <<< 26264 1727204239.21517: stdout chunk (state=3): >>># cleanup[2] 
removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder<<< 26264 1727204239.21542: stdout chunk (state=3): >>> # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess<<< 26264 1727204239.21578: stdout chunk (state=3): >>> # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging <<< 26264 1727204239.21591: stdout chunk (state=3): >>># cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat<<< 26264 1727204239.21641: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common # destroy 
ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy<<< 26264 1727204239.21687: stdout chunk (state=3): >>> # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy 
swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext<<< 26264 1727204239.21723: stdout chunk (state=3): >>> # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 26264 1727204239.22223: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 26264 1727204239.22257: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 26264 1727204239.22323: stdout chunk (state=3): >>># destroy zipimport <<< 26264 1727204239.22403: stdout chunk (state=3): >>># destroy _compression <<< 26264 1727204239.22441: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 26264 1727204239.22574: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux <<< 26264 1727204239.22609: stdout chunk (state=3): >>># destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 26264 1727204239.22612: stdout chunk (state=3): >>># destroy encodings <<< 26264 1727204239.22662: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 26264 1727204239.22715: stdout chunk (state=3): >>># destroy array <<< 26264 1727204239.22719: stdout 
chunk (state=3): >>># destroy datetime <<< 26264 1727204239.22791: stdout chunk (state=3): >>># destroy selinux # destroy distro <<< 26264 1727204239.22831: stdout chunk (state=3): >>># destroy json # destroy shlex # destroy logging # destroy argparse <<< 26264 1727204239.22955: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 26264 1727204239.23054: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 26264 1727204239.23173: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string <<< 26264 1727204239.23203: stdout chunk (state=3): >>># cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 26264 1727204239.23324: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit <<< 26264 1727204239.23405: stdout chunk (state=3): >>># cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 26264 1727204239.23544: stdout chunk (state=3): >>># cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd<<< 26264 1727204239.23681: stdout chunk (state=3): >>> # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap<<< 26264 1727204239.23809: stdout chunk (state=3): >>> # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools <<< 26264 1727204239.23872: stdout chunk (state=3): >>># cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 26264 1727204239.23943: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 26264 1727204239.24024: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 26264 1727204239.24087: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat <<< 26264 1727204239.24107: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc<<< 26264 1727204239.24119: stdout chunk (state=3): >>> # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 26264 1727204239.24144: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 26264 1727204239.24180: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 26264 1727204239.24204: stdout chunk (state=3): >>> # cleanup[3] wiping builtins # destroy 
systemd._daemon <<< 26264 1727204239.24217: stdout chunk (state=3): >>># destroy _socket # destroy systemd.id128<<< 26264 1727204239.24323: stdout chunk (state=3): >>> # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 26264 1727204239.24509: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 26264 1727204239.24590: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse <<< 26264 1727204239.24619: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath <<< 26264 1727204239.24695: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal<<< 26264 1727204239.24736: stdout chunk (state=3): >>> # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 26264 1727204239.24894: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request<<< 26264 1727204239.24967: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 26264 1727204239.25053: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 26264 1727204239.25561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204239.25572: stdout chunk (state=3): >>><<< 26264 1727204239.25588: stderr chunk (state=3): >>><<< 26264 1727204239.25696: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde98dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde98b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde98ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde3d490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde3d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde3d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbcf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbcf940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3abde55880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdbf2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abde3d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb6df10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb740a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb6e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb6d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abdaf1eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdaf19a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdaf1fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdaf1df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb01160> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb49e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb41700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb55760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb75eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abdb01d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb49340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abdb55370> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb7ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb01f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb01e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb01df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdad5460> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdad5550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdab30d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb04b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb044c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd9fe2b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdac0d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb04fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb7b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abda0ebe0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abda0ef10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abda21820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abda21d60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9af490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abda0ef40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9bf370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abda216a0> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9bf430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb01ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9db790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9dba60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd9db850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9db940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9dbd90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd9e52e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd9db9d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd9cfb20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abdb016a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd9dbb80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3abd3d2760> # zipimport: found 30 names in '/tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f98b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2f9160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f9280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f95e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f94f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f9e20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2f9580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2f9100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd24ffd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd26ec40> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd26ef40> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd26e2e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3abd361d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd3613a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd361f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd3d2a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2ccdc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2cc490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd303580> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2cc5b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2cc5e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd241f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd3412e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd23e7f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd341460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd359f40> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd23e790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd23e5e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd23d550> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd23d490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd3389a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f3abd2c26a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2c0bb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2d10d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd2c2100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd305c40> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # 
loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd20e940> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2bfd30> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2b67c0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd2c04c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib 
available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abcda5940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from 
Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3abd34cb50> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abcdadfa0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abcdf56d0> 
import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd200c10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3abd1fe5b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_xas_ugnc/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # 
cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # 
cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # 
destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] 
removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] 
wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping 
posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. [WARNING]: Module invocation had junk after the JSON data: [...]
26264 1727204239.26437: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', 
'_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204239.26440: _low_level_execute_command(): starting 26264 1727204239.26443: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204238.5242517-26432-195929088802997/ > /dev/null 2>&1 && sleep 0' 26264 1727204239.27123: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204239.27137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.27156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.27183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.27230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.27242: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204239.27260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.27280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204239.27301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204239.27314: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204239.27327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.27341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.27361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.27377: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.27389: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204239.27411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.27492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204239.27521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204239.27538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204239.27636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204239.30286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204239.30342: stderr chunk (state=3): >>><<< 26264 1727204239.30346: stdout chunk (state=3): >>><<< 26264 1727204239.30577: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204239.30580: handler run complete 26264 1727204239.30583: attempt loop complete, returning result 26264 1727204239.30585: _execute() done 26264 1727204239.30587: dumping result to json 26264 1727204239.30590: done dumping result, returning 26264 1727204239.30592: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [0affcd87-79f5-5ff5-08b0-00000000008f] 26264 1727204239.30594: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000008f 26264 1727204239.30662: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000008f 26264 1727204239.30668: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 26264 1727204239.30738: no more pending results, returning what we have 26264 1727204239.30741: results queue empty 26264 1727204239.30742: checking for any_errors_fatal 26264 1727204239.30750: done checking for any_errors_fatal 26264 1727204239.30751: checking for max_fail_percentage 26264 1727204239.30753: done checking for max_fail_percentage 26264 1727204239.30754: checking to see if all hosts have failed and the running result is not ok 26264 1727204239.30755: done checking to see if all hosts have failed 26264 1727204239.30755: getting the remaining hosts for this loop 26264 1727204239.30757: done getting the remaining hosts for this loop 26264 1727204239.30762: getting the next task for host managed-node3 26264 1727204239.30776: done getting next task for host managed-node3 26264 1727204239.30779: ^ task is: TASK: Set flag to indicate system is ostree 26264 1727204239.30782: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204239.30785: getting variables 26264 1727204239.30788: in VariableManager get_vars() 26264 1727204239.30819: Calling all_inventory to load vars for managed-node3 26264 1727204239.30822: Calling groups_inventory to load vars for managed-node3 26264 1727204239.30827: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.30839: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.30842: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.30846: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.31022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.31446: done with get_vars() 26264 1727204239.31458: done getting variables 26264 1727204239.31572: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.882) 0:00:03.165 ***** 26264 1727204239.31610: entering _queue_task() for managed-node3/set_fact 26264 1727204239.31611: Creating lock for set_fact 26264 1727204239.31962: worker is 1 (out of 1 available) 26264 
1727204239.31975: exiting _queue_task() for managed-node3/set_fact 26264 1727204239.31987: done queuing things up, now waiting for results queue to drain 26264 1727204239.31989: waiting for pending results... 26264 1727204239.32260: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 26264 1727204239.32392: in run() - task 0affcd87-79f5-5ff5-08b0-000000000090 26264 1727204239.32416: variable 'ansible_search_path' from source: unknown 26264 1727204239.32425: variable 'ansible_search_path' from source: unknown 26264 1727204239.32475: calling self._execute() 26264 1727204239.32576: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.32587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.32611: variable 'omit' from source: magic vars 26264 1727204239.33125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204239.33392: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204239.33444: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204239.33492: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204239.33533: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204239.33623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204239.33653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204239.33695: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204239.33734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204239.33877: Evaluated conditional (not __network_is_ostree is defined): True 26264 1727204239.33889: variable 'omit' from source: magic vars 26264 1727204239.33943: variable 'omit' from source: magic vars 26264 1727204239.34085: variable '__ostree_booted_stat' from source: set_fact 26264 1727204239.34146: variable 'omit' from source: magic vars 26264 1727204239.34178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204239.34209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204239.34499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204239.34523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204239.34538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204239.34584: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204239.34593: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.34603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.34720: Set connection var ansible_pipelining to False 26264 1727204239.34729: Set connection var ansible_connection to ssh 26264 1727204239.34735: Set connection var ansible_shell_type to sh 26264 1727204239.34746: Set connection var ansible_shell_executable 
to /bin/sh 26264 1727204239.34762: Set connection var ansible_timeout to 10 26264 1727204239.34778: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204239.34814: variable 'ansible_shell_executable' from source: unknown 26264 1727204239.34828: variable 'ansible_connection' from source: unknown 26264 1727204239.34836: variable 'ansible_module_compression' from source: unknown 26264 1727204239.34843: variable 'ansible_shell_type' from source: unknown 26264 1727204239.34853: variable 'ansible_shell_executable' from source: unknown 26264 1727204239.35168: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.35183: variable 'ansible_pipelining' from source: unknown 26264 1727204239.35192: variable 'ansible_timeout' from source: unknown 26264 1727204239.35200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.35317: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204239.35332: variable 'omit' from source: magic vars 26264 1727204239.35342: starting attempt loop 26264 1727204239.35350: running the handler 26264 1727204239.35367: handler run complete 26264 1727204239.35388: attempt loop complete, returning result 26264 1727204239.35400: _execute() done 26264 1727204239.35411: dumping result to json 26264 1727204239.35418: done dumping result, returning 26264 1727204239.35429: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [0affcd87-79f5-5ff5-08b0-000000000090] 26264 1727204239.35438: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000090 ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 26264 1727204239.35592: no 
more pending results, returning what we have 26264 1727204239.35596: results queue empty 26264 1727204239.35597: checking for any_errors_fatal 26264 1727204239.35603: done checking for any_errors_fatal 26264 1727204239.35604: checking for max_fail_percentage 26264 1727204239.35606: done checking for max_fail_percentage 26264 1727204239.35606: checking to see if all hosts have failed and the running result is not ok 26264 1727204239.35607: done checking to see if all hosts have failed 26264 1727204239.35608: getting the remaining hosts for this loop 26264 1727204239.35610: done getting the remaining hosts for this loop 26264 1727204239.35614: getting the next task for host managed-node3 26264 1727204239.35622: done getting next task for host managed-node3 26264 1727204239.35625: ^ task is: TASK: Fix CentOS6 Base repo 26264 1727204239.35629: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204239.35632: getting variables 26264 1727204239.35634: in VariableManager get_vars() 26264 1727204239.35668: Calling all_inventory to load vars for managed-node3 26264 1727204239.35672: Calling groups_inventory to load vars for managed-node3 26264 1727204239.35676: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.35688: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.35691: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.35694: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.35875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.36082: done with get_vars() 26264 1727204239.36101: done getting variables 26264 1727204239.36344: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 26264 1727204239.36370: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000090 26264 1727204239.36373: WORKER PROCESS EXITING TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.047) 0:00:03.213 ***** 26264 1727204239.36387: entering _queue_task() for managed-node3/copy 26264 1727204239.37772: worker is 1 (out of 1 available) 26264 1727204239.37785: exiting _queue_task() for managed-node3/copy 26264 1727204239.37796: done queuing things up, now waiting for results queue to drain 26264 1727204239.37798: waiting for pending results... 
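The ostree detection completed above — a `stat` result registered as `__ostree_booted_stat`, then a `set_fact` deriving `__network_is_ostree: false` — follows a common two-task pattern. A sketch of what those tasks in `el_repo_setup.yml` might look like, reconstructed only from this trace (the stat path and the exact `set_fact` expression are assumptions; only the variable names, the task names, and the `not __network_is_ostree is defined` conditional appear verbatim in the log):

```yaml
# Hypothetical reconstruction -- el_repo_setup.yml itself is not part of this log.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # assumed path; the trace shows only "exists": false
  register: __ostree_booted_stat

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined   # conditional shown verbatim in the trace
```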
26264 1727204239.38427: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 26264 1727204239.38711: in run() - task 0affcd87-79f5-5ff5-08b0-000000000092 26264 1727204239.38722: variable 'ansible_search_path' from source: unknown 26264 1727204239.38726: variable 'ansible_search_path' from source: unknown 26264 1727204239.38757: calling self._execute() 26264 1727204239.38825: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.38830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.38844: variable 'omit' from source: magic vars 26264 1727204239.39912: variable 'ansible_distribution' from source: facts 26264 1727204239.39936: Evaluated conditional (ansible_distribution == 'CentOS'): True 26264 1727204239.40227: variable 'ansible_distribution_major_version' from source: facts 26264 1727204239.40234: Evaluated conditional (ansible_distribution_major_version == '6'): False 26264 1727204239.40237: when evaluation is False, skipping this task 26264 1727204239.40239: _execute() done 26264 1727204239.40242: dumping result to json 26264 1727204239.40246: done dumping result, returning 26264 1727204239.40258: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [0affcd87-79f5-5ff5-08b0-000000000092] 26264 1727204239.40261: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000092 26264 1727204239.40353: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000092 26264 1727204239.40356: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 26264 1727204239.40437: no more pending results, returning what we have 26264 1727204239.40441: results queue empty 26264 1727204239.40442: checking for any_errors_fatal 26264 1727204239.40450: done checking for any_errors_fatal 26264 1727204239.40451: checking for 
max_fail_percentage 26264 1727204239.40453: done checking for max_fail_percentage 26264 1727204239.40454: checking to see if all hosts have failed and the running result is not ok 26264 1727204239.40455: done checking to see if all hosts have failed 26264 1727204239.40456: getting the remaining hosts for this loop 26264 1727204239.40457: done getting the remaining hosts for this loop 26264 1727204239.40462: getting the next task for host managed-node3 26264 1727204239.40478: done getting next task for host managed-node3 26264 1727204239.40482: ^ task is: TASK: Include the task 'enable_epel.yml' 26264 1727204239.40485: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204239.40488: getting variables 26264 1727204239.40490: in VariableManager get_vars() 26264 1727204239.40519: Calling all_inventory to load vars for managed-node3 26264 1727204239.40522: Calling groups_inventory to load vars for managed-node3 26264 1727204239.40526: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.40539: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.40543: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.40550: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.40966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.41285: done with get_vars() 26264 1727204239.41294: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.051) 0:00:03.264 ***** 26264 1727204239.41508: entering _queue_task() for managed-node3/include_tasks 26264 1727204239.42253: worker is 1 (out of 1 available) 26264 1727204239.42266: exiting _queue_task() for managed-node3/include_tasks 26264 1727204239.42278: done queuing things up, now waiting for results queue to drain 26264 1727204239.42280: waiting for pending results... 
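The include being queued here is gated on the fact set earlier; the conditional `not __network_is_ostree | d(false)` appears verbatim at the evaluation step later in the trace. A minimal sketch of the including task (file layout assumed from the task path shown above):

```yaml
# Sketch; only the task name and the 'when' expression are taken from the trace.
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)
```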
26264 1727204239.42725: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 26264 1727204239.42841: in run() - task 0affcd87-79f5-5ff5-08b0-000000000093 26264 1727204239.42869: variable 'ansible_search_path' from source: unknown 26264 1727204239.42878: variable 'ansible_search_path' from source: unknown 26264 1727204239.42924: calling self._execute() 26264 1727204239.43014: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.43024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.43036: variable 'omit' from source: magic vars 26264 1727204239.43606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204239.46518: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204239.46590: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204239.46630: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204239.46671: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204239.46701: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204239.46782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204239.46815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204239.46847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204239.46914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204239.46935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204239.47051: variable '__network_is_ostree' from source: set_fact 26264 1727204239.47077: Evaluated conditional (not __network_is_ostree | d(false)): True 26264 1727204239.47089: _execute() done 26264 1727204239.47097: dumping result to json 26264 1727204239.47104: done dumping result, returning 26264 1727204239.47114: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-5ff5-08b0-000000000093] 26264 1727204239.47124: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000093 26264 1727204239.47257: no more pending results, returning what we have 26264 1727204239.47262: in VariableManager get_vars() 26264 1727204239.47297: Calling all_inventory to load vars for managed-node3 26264 1727204239.47301: Calling groups_inventory to load vars for managed-node3 26264 1727204239.47304: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.47315: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.47318: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.47321: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.47496: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000093 26264 1727204239.47501: WORKER PROCESS EXITING 26264 1727204239.47513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 26264 1727204239.47740: done with get_vars() 26264 1727204239.47750: variable 'ansible_search_path' from source: unknown 26264 1727204239.47751: variable 'ansible_search_path' from source: unknown 26264 1727204239.47799: we have included files to process 26264 1727204239.47801: generating all_blocks data 26264 1727204239.47803: done generating all_blocks data 26264 1727204239.47807: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 26264 1727204239.47809: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 26264 1727204239.47811: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 26264 1727204239.48899: done processing included file 26264 1727204239.48901: iterating over new_blocks loaded from include file 26264 1727204239.48903: in VariableManager get_vars() 26264 1727204239.48915: done with get_vars() 26264 1727204239.48917: filtering new block on tags 26264 1727204239.48940: done filtering new block on tags 26264 1727204239.48942: in VariableManager get_vars() 26264 1727204239.48954: done with get_vars() 26264 1727204239.48956: filtering new block on tags 26264 1727204239.48968: done filtering new block on tags 26264 1727204239.48970: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 26264 1727204239.48976: extending task lists for all hosts with included blocks 26264 1727204239.49091: done extending task lists 26264 1727204239.49092: done processing included files 26264 1727204239.49093: results queue empty 26264 1727204239.49094: checking for any_errors_fatal 26264 1727204239.49097: done checking for any_errors_fatal 26264 1727204239.49098: checking for max_fail_percentage 26264 1727204239.49105: done 
checking for max_fail_percentage 26264 1727204239.49106: checking to see if all hosts have failed and the running result is not ok 26264 1727204239.49107: done checking to see if all hosts have failed 26264 1727204239.49108: getting the remaining hosts for this loop 26264 1727204239.49109: done getting the remaining hosts for this loop 26264 1727204239.49115: getting the next task for host managed-node3 26264 1727204239.49119: done getting next task for host managed-node3 26264 1727204239.49121: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 26264 1727204239.49124: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204239.49126: getting variables 26264 1727204239.49127: in VariableManager get_vars() 26264 1727204239.49135: Calling all_inventory to load vars for managed-node3 26264 1727204239.49137: Calling groups_inventory to load vars for managed-node3 26264 1727204239.49140: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.49145: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.49155: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.49158: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.49311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.49565: done with get_vars() 26264 1727204239.49573: done getting variables 26264 1727204239.49639: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 26264 1727204239.49869: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.084) 0:00:03.348 ***** 26264 1727204239.49920: entering _queue_task() for managed-node3/command 26264 1727204239.49921: Creating lock for command 26264 1727204239.50236: worker is 1 (out of 1 available) 26264 1727204239.50252: exiting _queue_task() for managed-node3/command 26264 1727204239.50271: done queuing things up, now waiting for results queue to drain 26264 1727204239.50273: waiting for pending results... 
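Note how the templated task name `Create EPEL {{ ansible_distribution_major_version }}` renders as `Create EPEL 9` in the banner, and the task is then skipped because major version 9 is not in `['7', '8']`. A hypothetical shape for that task (the module body is not visible in the trace — only that the `command` action plugin was loaded and which conditionals were evaluated):

```yaml
# Hypothetical; the real command is unknown from this log.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "true"   # placeholder body
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```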
26264 1727204239.50570: running TaskExecutor() for managed-node3/TASK: Create EPEL 9 26264 1727204239.50688: in run() - task 0affcd87-79f5-5ff5-08b0-0000000000ad 26264 1727204239.50711: variable 'ansible_search_path' from source: unknown 26264 1727204239.50721: variable 'ansible_search_path' from source: unknown 26264 1727204239.50777: calling self._execute() 26264 1727204239.50869: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.50881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.50895: variable 'omit' from source: magic vars 26264 1727204239.51307: variable 'ansible_distribution' from source: facts 26264 1727204239.51324: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 26264 1727204239.51516: variable 'ansible_distribution_major_version' from source: facts 26264 1727204239.51526: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 26264 1727204239.51533: when evaluation is False, skipping this task 26264 1727204239.51540: _execute() done 26264 1727204239.51547: dumping result to json 26264 1727204239.51559: done dumping result, returning 26264 1727204239.51573: done running TaskExecutor() for managed-node3/TASK: Create EPEL 9 [0affcd87-79f5-5ff5-08b0-0000000000ad] 26264 1727204239.51586: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000ad skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 26264 1727204239.51776: no more pending results, returning what we have 26264 1727204239.51780: results queue empty 26264 1727204239.51781: checking for any_errors_fatal 26264 1727204239.51782: done checking for any_errors_fatal 26264 1727204239.51783: checking for max_fail_percentage 26264 1727204239.51785: done checking for max_fail_percentage 26264 1727204239.51786: checking to see if all hosts have failed and 
the running result is not ok 26264 1727204239.51787: done checking to see if all hosts have failed 26264 1727204239.51787: getting the remaining hosts for this loop 26264 1727204239.51789: done getting the remaining hosts for this loop 26264 1727204239.51793: getting the next task for host managed-node3 26264 1727204239.51801: done getting next task for host managed-node3 26264 1727204239.51803: ^ task is: TASK: Install yum-utils package 26264 1727204239.51808: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204239.51812: getting variables 26264 1727204239.51814: in VariableManager get_vars() 26264 1727204239.51844: Calling all_inventory to load vars for managed-node3 26264 1727204239.51850: Calling groups_inventory to load vars for managed-node3 26264 1727204239.51854: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.51870: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.51873: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.51876: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.52073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.52299: done with get_vars() 26264 1727204239.52309: done getting variables 26264 1727204239.52522: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000ad 26264 1727204239.52525: WORKER PROCESS EXITING 26264 1727204239.52573: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.026) 0:00:03.375 ***** 26264 1727204239.52609: entering _queue_task() for managed-node3/package 26264 1727204239.52611: Creating lock for package 26264 1727204239.53078: worker is 1 (out of 1 available) 26264 1727204239.53090: exiting _queue_task() for managed-node3/package 26264 1727204239.53101: done queuing things up, now waiting for results queue to drain 26264 1727204239.53102: waiting for pending results... 
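The next task, also skipped on the same version conditional, loads the `package` action plugin. A sketch consistent with the trace (only the task name, the `package` action, and the two evaluated conditionals are visible; the package argument is inferred from the task name):

```yaml
# Hypothetical reconstruction from the task name and conditionals in the trace.
- name: Install yum-utils package
  package:
    name: yum-utils
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```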
26264 1727204239.53982: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 26264 1727204239.54221: in run() - task 0affcd87-79f5-5ff5-08b0-0000000000ae 26264 1727204239.54281: variable 'ansible_search_path' from source: unknown 26264 1727204239.54288: variable 'ansible_search_path' from source: unknown 26264 1727204239.54340: calling self._execute() 26264 1727204239.54459: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.54477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.54498: variable 'omit' from source: magic vars 26264 1727204239.54983: variable 'ansible_distribution' from source: facts 26264 1727204239.54999: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 26264 1727204239.55126: variable 'ansible_distribution_major_version' from source: facts 26264 1727204239.55138: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 26264 1727204239.55146: when evaluation is False, skipping this task 26264 1727204239.55154: _execute() done 26264 1727204239.55162: dumping result to json 26264 1727204239.55173: done dumping result, returning 26264 1727204239.55184: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [0affcd87-79f5-5ff5-08b0-0000000000ae] 26264 1727204239.55195: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000ae skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 26264 1727204239.55340: no more pending results, returning what we have 26264 1727204239.55344: results queue empty 26264 1727204239.55344: checking for any_errors_fatal 26264 1727204239.55356: done checking for any_errors_fatal 26264 1727204239.55356: checking for max_fail_percentage 26264 1727204239.55358: done checking for max_fail_percentage 26264 1727204239.55359: checking to see if 
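The skip above is driven by two stacked `when` conditionals, both of which the log prints as it evaluates them: the distribution check passes (True) but the major-version check fails (False), so the task never reaches the `package` action. A minimal sketch of what the task at `enable_epel.yml:26` likely looks like, reconstructed from the evaluated conditionals in the log (the package name and state are assumptions inferred from the task name, not confirmed by the log):

```yaml
# Hypothetical reconstruction -- only the task name, action plugin
# (package), and the two conditionals are confirmed by the log output.
- name: Install yum-utils package
  package:
    name: yum-utils     # assumed from the task name
    state: present      # assumed default intent
  when:
    - ansible_distribution in ['RedHat', 'CentOS']          # evaluated True
    - ansible_distribution_major_version in ['7', '8']      # evaluated False -> skip
```

With a list-form `when`, Ansible ANDs the conditions and stops at the first False, which matches the "when evaluation is False, skipping this task" line in the log.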
all hosts have failed and the running result is not ok 26264 1727204239.55360: done checking to see if all hosts have failed 26264 1727204239.55360: getting the remaining hosts for this loop 26264 1727204239.55362: done getting the remaining hosts for this loop 26264 1727204239.55367: getting the next task for host managed-node3 26264 1727204239.55373: done getting next task for host managed-node3 26264 1727204239.55375: ^ task is: TASK: Enable EPEL 7 26264 1727204239.55379: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204239.55382: getting variables 26264 1727204239.55384: in VariableManager get_vars() 26264 1727204239.55466: Calling all_inventory to load vars for managed-node3 26264 1727204239.55469: Calling groups_inventory to load vars for managed-node3 26264 1727204239.55473: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.55482: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.55485: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.55487: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.55644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.55870: done with get_vars() 26264 1727204239.55879: done getting variables 26264 1727204239.55916: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000ae 26264 1727204239.55920: WORKER PROCESS EXITING 26264 1727204239.55954: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.033) 0:00:03.409 ***** 26264 1727204239.55986: entering _queue_task() for managed-node3/command 26264 1727204239.56268: worker is 1 (out of 1 available) 26264 1727204239.56284: exiting _queue_task() for managed-node3/command 26264 1727204239.56297: done queuing things up, now waiting for results queue to drain 26264 1727204239.56299: waiting for pending results... 
26264 1727204239.56589: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 26264 1727204239.56716: in run() - task 0affcd87-79f5-5ff5-08b0-0000000000af 26264 1727204239.56732: variable 'ansible_search_path' from source: unknown 26264 1727204239.56739: variable 'ansible_search_path' from source: unknown 26264 1727204239.56798: calling self._execute() 26264 1727204239.56889: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.56911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.56925: variable 'omit' from source: magic vars 26264 1727204239.57369: variable 'ansible_distribution' from source: facts 26264 1727204239.57387: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 26264 1727204239.57559: variable 'ansible_distribution_major_version' from source: facts 26264 1727204239.57571: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 26264 1727204239.57578: when evaluation is False, skipping this task 26264 1727204239.57584: _execute() done 26264 1727204239.57590: dumping result to json 26264 1727204239.57597: done dumping result, returning 26264 1727204239.57606: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [0affcd87-79f5-5ff5-08b0-0000000000af] 26264 1727204239.57619: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000af skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 26264 1727204239.57784: no more pending results, returning what we have 26264 1727204239.57788: results queue empty 26264 1727204239.57789: checking for any_errors_fatal 26264 1727204239.57795: done checking for any_errors_fatal 26264 1727204239.57796: checking for max_fail_percentage 26264 1727204239.57797: done checking for max_fail_percentage 26264 1727204239.57798: checking to see if all hosts have failed and 
the running result is not ok 26264 1727204239.57799: done checking to see if all hosts have failed 26264 1727204239.57800: getting the remaining hosts for this loop 26264 1727204239.57802: done getting the remaining hosts for this loop 26264 1727204239.57806: getting the next task for host managed-node3 26264 1727204239.57812: done getting next task for host managed-node3 26264 1727204239.57815: ^ task is: TASK: Enable EPEL 8 26264 1727204239.57819: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204239.57823: getting variables 26264 1727204239.57825: in VariableManager get_vars() 26264 1727204239.57856: Calling all_inventory to load vars for managed-node3 26264 1727204239.57859: Calling groups_inventory to load vars for managed-node3 26264 1727204239.57865: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.57879: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.57882: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.57885: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.58070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.58270: done with get_vars() 26264 1727204239.58279: done getting variables 26264 1727204239.58642: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.026) 0:00:03.435 ***** 26264 1727204239.58680: entering _queue_task() for managed-node3/command 26264 1727204239.58700: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000af 26264 1727204239.58703: WORKER PROCESS EXITING 26264 1727204239.59165: worker is 1 (out of 1 available) 26264 1727204239.59176: exiting _queue_task() for managed-node3/command 26264 1727204239.59187: done queuing things up, now waiting for results queue to drain 26264 1727204239.59194: waiting for pending results... 
26264 1727204239.59711: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 26264 1727204239.59828: in run() - task 0affcd87-79f5-5ff5-08b0-0000000000b0 26264 1727204239.59846: variable 'ansible_search_path' from source: unknown 26264 1727204239.59858: variable 'ansible_search_path' from source: unknown 26264 1727204239.59900: calling self._execute() 26264 1727204239.59977: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.59989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.60001: variable 'omit' from source: magic vars 26264 1727204239.60481: variable 'ansible_distribution' from source: facts 26264 1727204239.60499: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 26264 1727204239.60644: variable 'ansible_distribution_major_version' from source: facts 26264 1727204239.60655: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 26264 1727204239.60663: when evaluation is False, skipping this task 26264 1727204239.60671: _execute() done 26264 1727204239.60677: dumping result to json 26264 1727204239.60684: done dumping result, returning 26264 1727204239.60692: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [0affcd87-79f5-5ff5-08b0-0000000000b0] 26264 1727204239.60701: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000b0 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 26264 1727204239.60852: no more pending results, returning what we have 26264 1727204239.60857: results queue empty 26264 1727204239.60857: checking for any_errors_fatal 26264 1727204239.60863: done checking for any_errors_fatal 26264 1727204239.60865: checking for max_fail_percentage 26264 1727204239.60867: done checking for max_fail_percentage 26264 1727204239.60868: checking to see if all hosts have failed and 
the running result is not ok 26264 1727204239.60869: done checking to see if all hosts have failed 26264 1727204239.60870: getting the remaining hosts for this loop 26264 1727204239.60871: done getting the remaining hosts for this loop 26264 1727204239.60876: getting the next task for host managed-node3 26264 1727204239.60885: done getting next task for host managed-node3 26264 1727204239.60888: ^ task is: TASK: Enable EPEL 6 26264 1727204239.60892: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204239.60895: getting variables 26264 1727204239.60897: in VariableManager get_vars() 26264 1727204239.60998: Calling all_inventory to load vars for managed-node3 26264 1727204239.61001: Calling groups_inventory to load vars for managed-node3 26264 1727204239.61005: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.61018: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.61021: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.61024: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.61192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.61386: done with get_vars() 26264 1727204239.61393: done getting variables 26264 1727204239.61469: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.028) 0:00:03.464 ***** 26264 1727204239.61499: entering _queue_task() for managed-node3/copy 26264 1727204239.61515: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000b0 26264 1727204239.61531: WORKER PROCESS EXITING 26264 1727204239.62094: worker is 1 (out of 1 available) 26264 1727204239.62107: exiting _queue_task() for managed-node3/copy 26264 1727204239.62140: done queuing things up, now waiting for results queue to drain 26264 1727204239.62142: waiting for pending results... 
26264 1727204239.62439: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 26264 1727204239.62567: in run() - task 0affcd87-79f5-5ff5-08b0-0000000000b2 26264 1727204239.62590: variable 'ansible_search_path' from source: unknown 26264 1727204239.62598: variable 'ansible_search_path' from source: unknown 26264 1727204239.62644: calling self._execute() 26264 1727204239.62734: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.62752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.62769: variable 'omit' from source: magic vars 26264 1727204239.63192: variable 'ansible_distribution' from source: facts 26264 1727204239.63218: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 26264 1727204239.63370: variable 'ansible_distribution_major_version' from source: facts 26264 1727204239.63388: Evaluated conditional (ansible_distribution_major_version == '6'): False 26264 1727204239.63407: when evaluation is False, skipping this task 26264 1727204239.63417: _execute() done 26264 1727204239.63425: dumping result to json 26264 1727204239.63431: done dumping result, returning 26264 1727204239.63439: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [0affcd87-79f5-5ff5-08b0-0000000000b2] 26264 1727204239.63448: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000b2 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 26264 1727204239.63611: no more pending results, returning what we have 26264 1727204239.63616: results queue empty 26264 1727204239.63616: checking for any_errors_fatal 26264 1727204239.63620: done checking for any_errors_fatal 26264 1727204239.63621: checking for max_fail_percentage 26264 1727204239.63623: done checking for max_fail_percentage 26264 1727204239.63623: checking to see if all hosts have failed and the running 
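Note that the "Enable EPEL 6" task differs from the EPEL 7/8 tasks in two ways visible in the log: it loads the `copy` action plugin instead of `command`, and its version conditional is a strict equality (`== '6'`) rather than a list membership test. A sketch under those observations (the `dest` and `content` values are pure assumptions; the log confirms only the action plugin and the conditional):

```yaml
# Sketch only: copy payload is hypothetical; the log confirms just the
# copy action and the false_condition "ansible_distribution_major_version == '6'".
- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo      # assumed destination
    content: "{{ epel_repo_definition }}" # assumed variable name
  when:
    - ansible_distribution in ['RedHat', 'CentOS']   # evaluated True
    - ansible_distribution_major_version == '6'      # evaluated False -> skip
```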
result is not ok 26264 1727204239.63625: done checking to see if all hosts have failed 26264 1727204239.63625: getting the remaining hosts for this loop 26264 1727204239.63628: done getting the remaining hosts for this loop 26264 1727204239.63632: getting the next task for host managed-node3 26264 1727204239.63640: done getting next task for host managed-node3 26264 1727204239.63643: ^ task is: TASK: Set network provider to 'nm' 26264 1727204239.63645: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204239.63650: getting variables 26264 1727204239.63652: in VariableManager get_vars() 26264 1727204239.63685: Calling all_inventory to load vars for managed-node3 26264 1727204239.63688: Calling groups_inventory to load vars for managed-node3 26264 1727204239.63692: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.63705: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.63708: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.63711: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.63940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.64157: done with get_vars() 26264 1727204239.64169: done getting variables 26264 1727204239.64256: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.027) 0:00:03.492 ***** 26264 1727204239.64292: entering _queue_task() for managed-node3/set_fact 26264 1727204239.64309: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000b2 26264 1727204239.64336: WORKER PROCESS EXITING 26264 1727204239.64831: worker is 1 (out of 1 available) 26264 1727204239.64843: exiting _queue_task() for managed-node3/set_fact 26264 1727204239.64855: done queuing things up, now waiting for results queue to drain 26264 1727204239.64857: waiting for pending results... 26264 1727204239.65157: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 26264 1727204239.65258: in run() - task 0affcd87-79f5-5ff5-08b0-000000000007 26264 1727204239.65279: variable 'ansible_search_path' from source: unknown 26264 1727204239.65352: calling self._execute() 26264 1727204239.65638: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.65653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.65678: variable 'omit' from source: magic vars 26264 1727204239.65813: variable 'omit' from source: magic vars 26264 1727204239.65858: variable 'omit' from source: magic vars 26264 1727204239.65925: variable 'omit' from source: magic vars 26264 1727204239.65985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204239.66132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204239.66157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204239.66191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204239.66208: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204239.66243: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204239.66252: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.66259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.66373: Set connection var ansible_pipelining to False 26264 1727204239.66381: Set connection var ansible_connection to ssh 26264 1727204239.66388: Set connection var ansible_shell_type to sh 26264 1727204239.66399: Set connection var ansible_shell_executable to /bin/sh 26264 1727204239.66419: Set connection var ansible_timeout to 10 26264 1727204239.66431: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204239.66458: variable 'ansible_shell_executable' from source: unknown 26264 1727204239.66468: variable 'ansible_connection' from source: unknown 26264 1727204239.66475: variable 'ansible_module_compression' from source: unknown 26264 1727204239.66481: variable 'ansible_shell_type' from source: unknown 26264 1727204239.66487: variable 'ansible_shell_executable' from source: unknown 26264 1727204239.66493: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.66500: variable 'ansible_pipelining' from source: unknown 26264 1727204239.66506: variable 'ansible_timeout' from source: unknown 26264 1727204239.66522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.66668: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204239.66684: variable 'omit' from source: magic vars 26264 1727204239.66694: starting 
attempt loop 26264 1727204239.66700: running the handler 26264 1727204239.66715: handler run complete 26264 1727204239.66739: attempt loop complete, returning result 26264 1727204239.66746: _execute() done 26264 1727204239.66752: dumping result to json 26264 1727204239.66759: done dumping result, returning 26264 1727204239.66771: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [0affcd87-79f5-5ff5-08b0-000000000007] 26264 1727204239.66779: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000007 ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 26264 1727204239.66968: no more pending results, returning what we have 26264 1727204239.66971: results queue empty 26264 1727204239.66972: checking for any_errors_fatal 26264 1727204239.66979: done checking for any_errors_fatal 26264 1727204239.66980: checking for max_fail_percentage 26264 1727204239.66982: done checking for max_fail_percentage 26264 1727204239.66983: checking to see if all hosts have failed and the running result is not ok 26264 1727204239.66984: done checking to see if all hosts have failed 26264 1727204239.66984: getting the remaining hosts for this loop 26264 1727204239.66986: done getting the remaining hosts for this loop 26264 1727204239.66990: getting the next task for host managed-node3 26264 1727204239.66997: done getting next task for host managed-node3 26264 1727204239.66998: ^ task is: TASK: meta (flush_handlers) 26264 1727204239.67000: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
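Unlike the skipped EPEL tasks, this one runs to completion: the `set_fact` handler executes locally (no conditional gate in the log) and returns `ok` with the fact in `ansible_facts`. The task at `tests_ethernet_nm.yml:13` can be reconstructed directly from that result:

```yaml
# Reconstructed from the "ok" result in the log, which shows
# ansible_facts: { network_provider: "nm" } and changed: false.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```

`set_fact` always reports `changed: false` and stores the value as a host-level fact, which is why the subsequent "Play for showing the network provider" can read `network_provider` for managed-node3.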
False 26264 1727204239.67005: getting variables 26264 1727204239.67007: in VariableManager get_vars() 26264 1727204239.67033: Calling all_inventory to load vars for managed-node3 26264 1727204239.67036: Calling groups_inventory to load vars for managed-node3 26264 1727204239.67039: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.67050: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.67053: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.67056: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.67232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.67445: done with get_vars() 26264 1727204239.67455: done getting variables 26264 1727204239.67619: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000007 26264 1727204239.67622: WORKER PROCESS EXITING 26264 1727204239.67654: in VariableManager get_vars() 26264 1727204239.67663: Calling all_inventory to load vars for managed-node3 26264 1727204239.67667: Calling groups_inventory to load vars for managed-node3 26264 1727204239.67669: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.67674: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.67677: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.67679: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.67980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.68184: done with get_vars() 26264 1727204239.68199: done queuing things up, now waiting for results queue to drain 26264 1727204239.68201: results queue empty 26264 1727204239.68202: checking for any_errors_fatal 26264 1727204239.68204: done checking for any_errors_fatal 26264 1727204239.68205: checking for max_fail_percentage 26264 
1727204239.68206: done checking for max_fail_percentage 26264 1727204239.68207: checking to see if all hosts have failed and the running result is not ok 26264 1727204239.68208: done checking to see if all hosts have failed 26264 1727204239.68209: getting the remaining hosts for this loop 26264 1727204239.68210: done getting the remaining hosts for this loop 26264 1727204239.68212: getting the next task for host managed-node3 26264 1727204239.68216: done getting next task for host managed-node3 26264 1727204239.68218: ^ task is: TASK: meta (flush_handlers) 26264 1727204239.68219: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204239.68228: getting variables 26264 1727204239.68229: in VariableManager get_vars() 26264 1727204239.68236: Calling all_inventory to load vars for managed-node3 26264 1727204239.68238: Calling groups_inventory to load vars for managed-node3 26264 1727204239.68240: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.68245: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.68247: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.68250: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.68423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.68624: done with get_vars() 26264 1727204239.68632: done getting variables 26264 1727204239.68679: in VariableManager get_vars() 26264 1727204239.68687: Calling all_inventory to load vars for managed-node3 26264 1727204239.68689: Calling groups_inventory to load vars for managed-node3 26264 1727204239.68696: Calling all_plugins_inventory to load vars for managed-node3 26264 
1727204239.68700: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.68706: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.68709: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.68853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.69054: done with get_vars() 26264 1727204239.69067: done queuing things up, now waiting for results queue to drain 26264 1727204239.69069: results queue empty 26264 1727204239.69070: checking for any_errors_fatal 26264 1727204239.69071: done checking for any_errors_fatal 26264 1727204239.69072: checking for max_fail_percentage 26264 1727204239.69073: done checking for max_fail_percentage 26264 1727204239.69074: checking to see if all hosts have failed and the running result is not ok 26264 1727204239.69074: done checking to see if all hosts have failed 26264 1727204239.69075: getting the remaining hosts for this loop 26264 1727204239.69076: done getting the remaining hosts for this loop 26264 1727204239.69078: getting the next task for host managed-node3 26264 1727204239.69082: done getting next task for host managed-node3 26264 1727204239.69082: ^ task is: None 26264 1727204239.69084: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204239.69085: done queuing things up, now waiting for results queue to drain 26264 1727204239.69086: results queue empty 26264 1727204239.69086: checking for any_errors_fatal 26264 1727204239.69087: done checking for any_errors_fatal 26264 1727204239.69088: checking for max_fail_percentage 26264 1727204239.69089: done checking for max_fail_percentage 26264 1727204239.69089: checking to see if all hosts have failed and the running result is not ok 26264 1727204239.69090: done checking to see if all hosts have failed 26264 1727204239.69092: getting the next task for host managed-node3 26264 1727204239.69094: done getting next task for host managed-node3 26264 1727204239.69095: ^ task is: None 26264 1727204239.69096: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204239.69158: in VariableManager get_vars() 26264 1727204239.69175: done with get_vars() 26264 1727204239.69181: in VariableManager get_vars() 26264 1727204239.69191: done with get_vars() 26264 1727204239.69195: variable 'omit' from source: magic vars 26264 1727204239.69228: in VariableManager get_vars() 26264 1727204239.69239: done with get_vars() 26264 1727204239.69275: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 26264 1727204239.69470: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 26264 1727204239.69498: getting the remaining hosts for this loop 26264 1727204239.69500: done getting the remaining hosts for this loop 26264 1727204239.69502: getting the next task for host managed-node3 26264 1727204239.69505: done getting next task for host managed-node3 26264 1727204239.69507: ^ task is: TASK: Gathering Facts 26264 1727204239.69509: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204239.69511: getting variables 26264 1727204239.69512: in VariableManager get_vars() 26264 1727204239.69519: Calling all_inventory to load vars for managed-node3 26264 1727204239.69521: Calling groups_inventory to load vars for managed-node3 26264 1727204239.69524: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204239.69529: Calling all_plugins_play to load vars for managed-node3 26264 1727204239.69543: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204239.69546: Calling groups_plugins_play to load vars for managed-node3 26264 1727204239.69726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204239.69919: done with get_vars() 26264 1727204239.69927: done getting variables 26264 1727204239.69969: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.057) 0:00:03.549 ***** 26264 1727204239.70001: entering _queue_task() for managed-node3/gather_facts 26264 1727204239.70291: worker is 1 (out of 1 available) 26264 1727204239.70305: exiting _queue_task() for managed-node3/gather_facts 26264 1727204239.70326: done queuing things up, now waiting for results queue to drain 26264 1727204239.70328: waiting for pending results... 
26264 1727204239.70615: running TaskExecutor() for managed-node3/TASK: Gathering Facts 26264 1727204239.70750: in run() - task 0affcd87-79f5-5ff5-08b0-0000000000d8 26264 1727204239.70778: variable 'ansible_search_path' from source: unknown 26264 1727204239.70821: calling self._execute() 26264 1727204239.70938: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.70949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.70963: variable 'omit' from source: magic vars 26264 1727204239.71397: variable 'ansible_distribution_major_version' from source: facts 26264 1727204239.71439: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204239.71451: variable 'omit' from source: magic vars 26264 1727204239.71482: variable 'omit' from source: magic vars 26264 1727204239.71529: variable 'omit' from source: magic vars 26264 1727204239.71580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204239.71634: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204239.71669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204239.71694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204239.71712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204239.71757: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204239.71769: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.71777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.71898: Set connection var ansible_pipelining to False 26264 1727204239.71907: Set 
connection var ansible_connection to ssh 26264 1727204239.71913: Set connection var ansible_shell_type to sh 26264 1727204239.71924: Set connection var ansible_shell_executable to /bin/sh 26264 1727204239.71943: Set connection var ansible_timeout to 10 26264 1727204239.71961: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204239.72003: variable 'ansible_shell_executable' from source: unknown 26264 1727204239.72018: variable 'ansible_connection' from source: unknown 26264 1727204239.72027: variable 'ansible_module_compression' from source: unknown 26264 1727204239.72035: variable 'ansible_shell_type' from source: unknown 26264 1727204239.72041: variable 'ansible_shell_executable' from source: unknown 26264 1727204239.72047: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204239.72059: variable 'ansible_pipelining' from source: unknown 26264 1727204239.72070: variable 'ansible_timeout' from source: unknown 26264 1727204239.72084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204239.72316: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204239.72332: variable 'omit' from source: magic vars 26264 1727204239.72342: starting attempt loop 26264 1727204239.72349: running the handler 26264 1727204239.72371: variable 'ansible_facts' from source: unknown 26264 1727204239.72416: _low_level_execute_command(): starting 26264 1727204239.72431: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204239.73383: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204239.73404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
26264 1727204239.73423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.73441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.73487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.73499: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204239.73519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.73547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204239.73561: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204239.73575: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204239.73587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.73601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.73616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.73633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.73649: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204239.73663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.73730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204239.73762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204239.73783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204239.73909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 
1727204239.76411: stdout chunk (state=3): >>>/root <<< 26264 1727204239.76671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204239.76675: stdout chunk (state=3): >>><<< 26264 1727204239.76693: stderr chunk (state=3): >>><<< 26264 1727204239.76816: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204239.76819: _low_level_execute_command(): starting 26264 1727204239.76822: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823 `" && echo ansible-tmp-1727204239.7671542-26485-117693108693823="` echo /root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823 `" ) && sleep 0' 26264 1727204239.77539: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 26264 1727204239.77558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.77593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.77612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.77660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.77676: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204239.77697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.77716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204239.77730: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204239.77753: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204239.77786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.77811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.77902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.77921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.77934: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204239.77952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.78046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204239.78075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204239.78094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 26264 1727204239.78182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204239.80874: stdout chunk (state=3): >>>ansible-tmp-1727204239.7671542-26485-117693108693823=/root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823 <<< 26264 1727204239.81051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204239.81162: stderr chunk (state=3): >>><<< 26264 1727204239.81177: stdout chunk (state=3): >>><<< 26264 1727204239.81470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204239.7671542-26485-117693108693823=/root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204239.81474: variable 'ansible_module_compression' from source: unknown 26264 1727204239.81476: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204239.81478: variable 'ansible_facts' from source: unknown 26264 1727204239.81509: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823/AnsiballZ_setup.py 26264 1727204239.81695: Sending initial data 26264 1727204239.81701: Sent initial data (154 bytes) 26264 1727204239.82807: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204239.82827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.82842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.82867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.82921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.82935: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204239.82951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.82971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204239.82982: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204239.82992: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204239.83003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.83017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.83043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.83059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 
1727204239.83071: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204239.83085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.83174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204239.83196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204239.83211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204239.83296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204239.85860: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204239.85907: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204239.85955: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmptn9m5o5n /root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823/AnsiballZ_setup.py <<< 26264 1727204239.85998: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204239.88591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204239.88819: stderr chunk (state=3): >>><<< 26264 1727204239.88823: stdout chunk (state=3): >>><<< 26264 1727204239.88826: done transferring 
module to remote 26264 1727204239.88833: _low_level_execute_command(): starting 26264 1727204239.88836: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823/ /root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823/AnsiballZ_setup.py && sleep 0' 26264 1727204239.89795: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204239.89808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.89822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.89846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.89894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.89906: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204239.89918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.89937: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204239.89959: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204239.89973: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204239.89985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.89997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.90012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.90023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.90034: stderr chunk (state=3): >>>debug2: match found <<< 26264 
1727204239.90046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.90136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204239.90162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204239.90190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204239.90277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204239.93001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204239.93005: stdout chunk (state=3): >>><<< 26264 1727204239.93008: stderr chunk (state=3): >>><<< 26264 1727204239.93133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204239.93137: 
_low_level_execute_command(): starting 26264 1727204239.93140: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823/AnsiballZ_setup.py && sleep 0' 26264 1727204239.93962: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204239.93980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.93999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.94029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.94085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.94098: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204239.94124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.94147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204239.94165: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204239.94178: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204239.94191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204239.94206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204239.94228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204239.94253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204239.94268: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204239.94284: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204239.94376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204239.94399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204239.94417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204239.94521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204240.60924: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "20", "epoch": "1727204240", "epoch_int": "1727204240", "date": "2024-09-24", "time": "14:57:20", "iso8601_micro": "2024-09-24T18:57:20.315686Z", "iso8601": "2024-09-24T18:57:20Z", "iso8601_basic": "20240924T145720315686", "iso8601_basic_short": "20240924T145720", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", 
"ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:19<<< 26264 1727204240.60947: stdout chunk (state=3): >>>2M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2792, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 740, "free": 2792}, "nocache": {"free": 3251, "used": 281}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 586, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0<<< 26264 1727204240.60954: stdout chunk (state=3): >>>, "size_total": 268367278080, "size_available": 264279797760, "block_size": 4096, "block_total": 65519355, "block_available": 64521435, "block_used": 997920, "inode_total": 131071472, "inode_available": 130998308, "inode_used": 73164, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": 
"22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": 
"off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_loadavg": {"1m": 0.47, "5m": 0.37, "15m": 0.19}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204240.63226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204240.63231: stdout chunk (state=3): >>><<< 26264 1727204240.63233: stderr chunk (state=3): >>><<< 26264 1727204240.63386: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "20", "epoch": "1727204240", "epoch_int": "1727204240", "date": "2024-09-24", "time": "14:57:20", "iso8601_micro": "2024-09-24T18:57:20.315686Z", "iso8601": "2024-09-24T18:57:20Z", "iso8601_basic": "20240924T145720315686", "iso8601_basic_short": "20240924T145720", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", 
"10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2792, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 740, "free": 2792}, "nocache": {"free": 3251, "used": 281}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 586, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264279797760, "block_size": 4096, "block_total": 65519355, "block_available": 64521435, "block_used": 997920, "inode_total": 131071472, "inode_available": 130998308, "inode_used": 73164, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_loadavg": {"1m": 0.47, "5m": 0.37, "15m": 0.19}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204240.63687: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204240.63902: _low_level_execute_command(): starting 26264 1727204240.63914: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204239.7671542-26485-117693108693823/ > /dev/null 2>&1 && sleep 0' 26264 1727204240.64592: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204240.65131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204240.65153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204240.65188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204240.65229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204240.65337: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204240.65355: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204240.65367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204240.65382: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204240.65388: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204240.65395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204240.65403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204240.65413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204240.65420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204240.65427: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204240.65436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204240.65509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204240.65591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204240.65612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204240.65725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204240.68169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204240.68173: stdout chunk (state=3): >>><<< 26264 1727204240.68175: stderr chunk (state=3): >>><<< 26264 1727204240.68269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204240.68273: handler run complete 26264 1727204240.68470: variable 'ansible_facts' from source: unknown 26264 1727204240.68504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.68849: variable 'ansible_facts' from source: unknown 26264 1727204240.68951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.69099: attempt loop complete, returning result 26264 1727204240.69110: _execute() done 26264 1727204240.69117: dumping result to json 26264 1727204240.69167: done dumping result, returning 26264 1727204240.69180: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-5ff5-08b0-0000000000d8] 26264 1727204240.69190: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000d8 ok: [managed-node3] 26264 1727204240.69917: no more pending results, returning what we have 26264 1727204240.69920: results queue empty 26264 1727204240.69921: checking for any_errors_fatal 26264 1727204240.69922: done checking for any_errors_fatal 
26264 1727204240.69923: checking for max_fail_percentage 26264 1727204240.69925: done checking for max_fail_percentage 26264 1727204240.69925: checking to see if all hosts have failed and the running result is not ok 26264 1727204240.69927: done checking to see if all hosts have failed 26264 1727204240.69928: getting the remaining hosts for this loop 26264 1727204240.69929: done getting the remaining hosts for this loop 26264 1727204240.69934: getting the next task for host managed-node3 26264 1727204240.69940: done getting next task for host managed-node3 26264 1727204240.69949: ^ task is: TASK: meta (flush_handlers) 26264 1727204240.69956: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204240.69973: getting variables 26264 1727204240.69986: in VariableManager get_vars() 26264 1727204240.70042: Calling all_inventory to load vars for managed-node3 26264 1727204240.70045: Calling groups_inventory to load vars for managed-node3 26264 1727204240.70049: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204240.70061: Calling all_plugins_play to load vars for managed-node3 26264 1727204240.70065: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204240.70068: Calling groups_plugins_play to load vars for managed-node3 26264 1727204240.70271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.70603: done with get_vars() 26264 1727204240.70613: done getting variables 26264 1727204240.70967: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000d8 26264 1727204240.70971: WORKER PROCESS EXITING 26264 1727204240.71027: in VariableManager get_vars() 26264 1727204240.71036: Calling 
all_inventory to load vars for managed-node3 26264 1727204240.71038: Calling groups_inventory to load vars for managed-node3 26264 1727204240.71040: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204240.71045: Calling all_plugins_play to load vars for managed-node3 26264 1727204240.71047: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204240.71054: Calling groups_plugins_play to load vars for managed-node3 26264 1727204240.71427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.71839: done with get_vars() 26264 1727204240.71858: done queuing things up, now waiting for results queue to drain 26264 1727204240.71860: results queue empty 26264 1727204240.71861: checking for any_errors_fatal 26264 1727204240.71866: done checking for any_errors_fatal 26264 1727204240.71867: checking for max_fail_percentage 26264 1727204240.71868: done checking for max_fail_percentage 26264 1727204240.71869: checking to see if all hosts have failed and the running result is not ok 26264 1727204240.71870: done checking to see if all hosts have failed 26264 1727204240.71870: getting the remaining hosts for this loop 26264 1727204240.71871: done getting the remaining hosts for this loop 26264 1727204240.71874: getting the next task for host managed-node3 26264 1727204240.71877: done getting next task for host managed-node3 26264 1727204240.71879: ^ task is: TASK: Show inside ethernet tests 26264 1727204240.71881: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204240.71883: getting variables 26264 1727204240.71884: in VariableManager get_vars() 26264 1727204240.71892: Calling all_inventory to load vars for managed-node3 26264 1727204240.71894: Calling groups_inventory to load vars for managed-node3 26264 1727204240.71897: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204240.71901: Calling all_plugins_play to load vars for managed-node3 26264 1727204240.71903: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204240.71913: Calling groups_plugins_play to load vars for managed-node3 26264 1727204240.72094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.72296: done with get_vars() 26264 1727204240.72305: done getting variables 26264 1727204240.72384: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Tuesday 24 September 2024 14:57:20 -0400 (0:00:01.024) 0:00:04.573 ***** 26264 1727204240.72419: entering _queue_task() for managed-node3/debug 26264 1727204240.72420: Creating lock for debug 26264 1727204240.72737: worker is 1 (out of 1 available) 26264 1727204240.72754: exiting _queue_task() for managed-node3/debug 26264 1727204240.72771: done queuing things up, now waiting for results queue to drain 26264 1727204240.72774: waiting for pending results... 
26264 1727204240.72981: running TaskExecutor() for managed-node3/TASK: Show inside ethernet tests 26264 1727204240.73029: in run() - task 0affcd87-79f5-5ff5-08b0-00000000000b 26264 1727204240.73040: variable 'ansible_search_path' from source: unknown 26264 1727204240.73073: calling self._execute() 26264 1727204240.73131: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204240.73135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204240.73144: variable 'omit' from source: magic vars 26264 1727204240.73479: variable 'ansible_distribution_major_version' from source: facts 26264 1727204240.73490: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204240.73496: variable 'omit' from source: magic vars 26264 1727204240.73517: variable 'omit' from source: magic vars 26264 1727204240.73540: variable 'omit' from source: magic vars 26264 1727204240.73577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204240.73621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204240.73637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204240.73658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204240.73677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204240.73714: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204240.73717: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204240.73719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204240.73817: Set connection var ansible_pipelining to False 26264 1727204240.73827: 
Set connection var ansible_connection to ssh 26264 1727204240.73832: Set connection var ansible_shell_type to sh 26264 1727204240.73842: Set connection var ansible_shell_executable to /bin/sh 26264 1727204240.73854: Set connection var ansible_timeout to 10 26264 1727204240.73874: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204240.73929: variable 'ansible_shell_executable' from source: unknown 26264 1727204240.73939: variable 'ansible_connection' from source: unknown 26264 1727204240.73946: variable 'ansible_module_compression' from source: unknown 26264 1727204240.73954: variable 'ansible_shell_type' from source: unknown 26264 1727204240.73960: variable 'ansible_shell_executable' from source: unknown 26264 1727204240.73969: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204240.73976: variable 'ansible_pipelining' from source: unknown 26264 1727204240.73983: variable 'ansible_timeout' from source: unknown 26264 1727204240.73989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204240.74134: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204240.74156: variable 'omit' from source: magic vars 26264 1727204240.74168: starting attempt loop 26264 1727204240.74175: running the handler 26264 1727204240.74223: handler run complete 26264 1727204240.74248: attempt loop complete, returning result 26264 1727204240.74257: _execute() done 26264 1727204240.74264: dumping result to json 26264 1727204240.74273: done dumping result, returning 26264 1727204240.74282: done running TaskExecutor() for managed-node3/TASK: Show inside ethernet tests [0affcd87-79f5-5ff5-08b0-00000000000b] 26264 1727204240.74291: sending task result for 
task 0affcd87-79f5-5ff5-08b0-00000000000b ok: [managed-node3] => {} MSG: Inside ethernet tests 26264 1727204240.74474: no more pending results, returning what we have 26264 1727204240.74478: results queue empty 26264 1727204240.74479: checking for any_errors_fatal 26264 1727204240.74480: done checking for any_errors_fatal 26264 1727204240.74481: checking for max_fail_percentage 26264 1727204240.74483: done checking for max_fail_percentage 26264 1727204240.74484: checking to see if all hosts have failed and the running result is not ok 26264 1727204240.74488: done checking to see if all hosts have failed 26264 1727204240.74510: getting the remaining hosts for this loop 26264 1727204240.74514: done getting the remaining hosts for this loop 26264 1727204240.74518: getting the next task for host managed-node3 26264 1727204240.74524: done getting next task for host managed-node3 26264 1727204240.74527: ^ task is: TASK: Show network_provider 26264 1727204240.74546: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204240.74552: getting variables 26264 1727204240.74554: in VariableManager get_vars() 26264 1727204240.75339: Calling all_inventory to load vars for managed-node3 26264 1727204240.75346: Calling groups_inventory to load vars for managed-node3 26264 1727204240.75352: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204240.75366: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000000b 26264 1727204240.75370: WORKER PROCESS EXITING 26264 1727204240.75380: Calling all_plugins_play to load vars for managed-node3 26264 1727204240.75386: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204240.75390: Calling groups_plugins_play to load vars for managed-node3 26264 1727204240.75600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.75791: done with get_vars() 26264 1727204240.75800: done getting variables 26264 1727204240.75864: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.034) 0:00:04.608 ***** 26264 1727204240.75892: entering _queue_task() for managed-node3/debug 26264 1727204240.76131: worker is 1 (out of 1 available) 26264 1727204240.76142: exiting _queue_task() for managed-node3/debug 26264 1727204240.76156: done queuing things up, now waiting for results queue to drain 26264 1727204240.76158: waiting for pending results... 
26264 1727204240.76405: running TaskExecutor() for managed-node3/TASK: Show network_provider 26264 1727204240.76507: in run() - task 0affcd87-79f5-5ff5-08b0-00000000000c 26264 1727204240.76531: variable 'ansible_search_path' from source: unknown 26264 1727204240.76575: calling self._execute() 26264 1727204240.76666: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204240.76676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204240.76689: variable 'omit' from source: magic vars 26264 1727204240.76985: variable 'ansible_distribution_major_version' from source: facts 26264 1727204240.76996: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204240.77003: variable 'omit' from source: magic vars 26264 1727204240.77047: variable 'omit' from source: magic vars 26264 1727204240.77099: variable 'omit' from source: magic vars 26264 1727204240.77150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204240.77204: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204240.77235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204240.77258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204240.77293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204240.77328: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204240.77337: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204240.77345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204240.77450: Set connection var ansible_pipelining to False 26264 1727204240.77459: Set 
connection var ansible_connection to ssh 26264 1727204240.77470: Set connection var ansible_shell_type to sh 26264 1727204240.77481: Set connection var ansible_shell_executable to /bin/sh 26264 1727204240.77492: Set connection var ansible_timeout to 10 26264 1727204240.77507: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204240.77531: variable 'ansible_shell_executable' from source: unknown 26264 1727204240.77538: variable 'ansible_connection' from source: unknown 26264 1727204240.77545: variable 'ansible_module_compression' from source: unknown 26264 1727204240.77552: variable 'ansible_shell_type' from source: unknown 26264 1727204240.77558: variable 'ansible_shell_executable' from source: unknown 26264 1727204240.77566: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204240.77574: variable 'ansible_pipelining' from source: unknown 26264 1727204240.77581: variable 'ansible_timeout' from source: unknown 26264 1727204240.77587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204240.77746: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204240.77767: variable 'omit' from source: magic vars 26264 1727204240.77778: starting attempt loop 26264 1727204240.77785: running the handler 26264 1727204240.77848: variable 'network_provider' from source: set_fact 26264 1727204240.77933: variable 'network_provider' from source: set_fact 26264 1727204240.77966: handler run complete 26264 1727204240.77989: attempt loop complete, returning result 26264 1727204240.77996: _execute() done 26264 1727204240.78002: dumping result to json 26264 1727204240.78009: done dumping result, returning 26264 1727204240.78019: done running 
TaskExecutor() for managed-node3/TASK: Show network_provider [0affcd87-79f5-5ff5-08b0-00000000000c] 26264 1727204240.78028: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000000c ok: [managed-node3] => { "network_provider": "nm" } 26264 1727204240.78174: no more pending results, returning what we have 26264 1727204240.78177: results queue empty 26264 1727204240.78178: checking for any_errors_fatal 26264 1727204240.78186: done checking for any_errors_fatal 26264 1727204240.78187: checking for max_fail_percentage 26264 1727204240.78188: done checking for max_fail_percentage 26264 1727204240.78189: checking to see if all hosts have failed and the running result is not ok 26264 1727204240.78190: done checking to see if all hosts have failed 26264 1727204240.78191: getting the remaining hosts for this loop 26264 1727204240.78194: done getting the remaining hosts for this loop 26264 1727204240.78197: getting the next task for host managed-node3 26264 1727204240.78204: done getting next task for host managed-node3 26264 1727204240.78206: ^ task is: TASK: meta (flush_handlers) 26264 1727204240.78208: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204240.78211: getting variables 26264 1727204240.78212: in VariableManager get_vars() 26264 1727204240.78239: Calling all_inventory to load vars for managed-node3 26264 1727204240.78242: Calling groups_inventory to load vars for managed-node3 26264 1727204240.78245: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204240.78254: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000000c 26264 1727204240.78258: WORKER PROCESS EXITING 26264 1727204240.78272: Calling all_plugins_play to load vars for managed-node3 26264 1727204240.78275: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204240.78278: Calling groups_plugins_play to load vars for managed-node3 26264 1727204240.78486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.78678: done with get_vars() 26264 1727204240.78689: done getting variables 26264 1727204240.78767: in VariableManager get_vars() 26264 1727204240.78777: Calling all_inventory to load vars for managed-node3 26264 1727204240.78779: Calling groups_inventory to load vars for managed-node3 26264 1727204240.78781: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204240.78786: Calling all_plugins_play to load vars for managed-node3 26264 1727204240.78788: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204240.78790: Calling groups_plugins_play to load vars for managed-node3 26264 1727204240.79507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.79720: done with get_vars() 26264 1727204240.79733: done queuing things up, now waiting for results queue to drain 26264 1727204240.79735: results queue empty 26264 1727204240.79736: checking for any_errors_fatal 26264 1727204240.79738: done checking for any_errors_fatal 26264 1727204240.79739: checking for max_fail_percentage 26264 
1727204240.79740: done checking for max_fail_percentage 26264 1727204240.79741: checking to see if all hosts have failed and the running result is not ok 26264 1727204240.79742: done checking to see if all hosts have failed 26264 1727204240.79742: getting the remaining hosts for this loop 26264 1727204240.79743: done getting the remaining hosts for this loop 26264 1727204240.79746: getting the next task for host managed-node3 26264 1727204240.79757: done getting next task for host managed-node3 26264 1727204240.79759: ^ task is: TASK: meta (flush_handlers) 26264 1727204240.79760: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204240.79763: getting variables 26264 1727204240.79766: in VariableManager get_vars() 26264 1727204240.79774: Calling all_inventory to load vars for managed-node3 26264 1727204240.79776: Calling groups_inventory to load vars for managed-node3 26264 1727204240.79779: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204240.79787: Calling all_plugins_play to load vars for managed-node3 26264 1727204240.79790: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204240.79797: Calling groups_plugins_play to load vars for managed-node3 26264 1727204240.80017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.80292: done with get_vars() 26264 1727204240.80339: done getting variables 26264 1727204240.80393: in VariableManager get_vars() 26264 1727204240.80401: Calling all_inventory to load vars for managed-node3 26264 1727204240.80403: Calling groups_inventory to load vars for managed-node3 26264 1727204240.80406: Calling all_plugins_inventory to load vars for managed-node3 26264 
1727204240.80410: Calling all_plugins_play to load vars for managed-node3 26264 1727204240.80412: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204240.80415: Calling groups_plugins_play to load vars for managed-node3 26264 1727204240.80879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.80995: done with get_vars() 26264 1727204240.81003: done queuing things up, now waiting for results queue to drain 26264 1727204240.81004: results queue empty 26264 1727204240.81005: checking for any_errors_fatal 26264 1727204240.81006: done checking for any_errors_fatal 26264 1727204240.81006: checking for max_fail_percentage 26264 1727204240.81007: done checking for max_fail_percentage 26264 1727204240.81007: checking to see if all hosts have failed and the running result is not ok 26264 1727204240.81008: done checking to see if all hosts have failed 26264 1727204240.81008: getting the remaining hosts for this loop 26264 1727204240.81009: done getting the remaining hosts for this loop 26264 1727204240.81010: getting the next task for host managed-node3 26264 1727204240.81012: done getting next task for host managed-node3 26264 1727204240.81013: ^ task is: None 26264 1727204240.81014: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204240.81015: done queuing things up, now waiting for results queue to drain 26264 1727204240.81015: results queue empty 26264 1727204240.81016: checking for any_errors_fatal 26264 1727204240.81016: done checking for any_errors_fatal 26264 1727204240.81017: checking for max_fail_percentage 26264 1727204240.81017: done checking for max_fail_percentage 26264 1727204240.81018: checking to see if all hosts have failed and the running result is not ok 26264 1727204240.81018: done checking to see if all hosts have failed 26264 1727204240.81019: getting the next task for host managed-node3 26264 1727204240.81021: done getting next task for host managed-node3 26264 1727204240.81021: ^ task is: None 26264 1727204240.81022: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204240.81054: in VariableManager get_vars() 26264 1727204240.81066: done with get_vars() 26264 1727204240.81069: in VariableManager get_vars() 26264 1727204240.81075: done with get_vars() 26264 1727204240.81077: variable 'omit' from source: magic vars 26264 1727204240.81098: in VariableManager get_vars() 26264 1727204240.81104: done with get_vars() 26264 1727204240.81117: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 26264 1727204240.81238: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 26264 1727204240.81260: getting the remaining hosts for this loop 26264 1727204240.81261: done getting the remaining hosts for this loop 26264 1727204240.81263: getting the next task for host managed-node3 26264 1727204240.81267: done getting next task for host managed-node3 26264 1727204240.81269: ^ task is: TASK: Gathering Facts 26264 1727204240.81270: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204240.81271: getting variables 26264 1727204240.81272: in VariableManager get_vars() 26264 1727204240.81277: Calling all_inventory to load vars for managed-node3 26264 1727204240.81278: Calling groups_inventory to load vars for managed-node3 26264 1727204240.81280: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204240.81283: Calling all_plugins_play to load vars for managed-node3 26264 1727204240.81284: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204240.81286: Calling groups_plugins_play to load vars for managed-node3 26264 1727204240.81365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204240.81470: done with get_vars() 26264 1727204240.81475: done getting variables 26264 1727204240.81503: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.056) 0:00:04.664 ***** 26264 1727204240.81519: entering _queue_task() for managed-node3/gather_facts 26264 1727204240.81717: worker is 1 (out of 1 available) 26264 1727204240.81730: exiting _queue_task() for managed-node3/gather_facts 26264 1727204240.81741: done queuing things up, now waiting for results queue to drain 26264 1727204240.81743: waiting for pending results... 
26264 1727204240.81905: running TaskExecutor() for managed-node3/TASK: Gathering Facts 26264 1727204240.81977: in run() - task 0affcd87-79f5-5ff5-08b0-0000000000f0 26264 1727204240.81989: variable 'ansible_search_path' from source: unknown 26264 1727204240.82020: calling self._execute() 26264 1727204240.82083: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204240.82098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204240.82145: variable 'omit' from source: magic vars 26264 1727204240.82580: variable 'ansible_distribution_major_version' from source: facts 26264 1727204240.82598: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204240.82609: variable 'omit' from source: magic vars 26264 1727204240.82646: variable 'omit' from source: magic vars 26264 1727204240.82695: variable 'omit' from source: magic vars 26264 1727204240.82764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204240.82813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204240.82842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204240.82872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204240.82892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204240.82923: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204240.82931: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204240.82941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204240.83052: Set connection var ansible_pipelining to False 26264 1727204240.83065: Set 
connection var ansible_connection to ssh 26264 1727204240.83077: Set connection var ansible_shell_type to sh 26264 1727204240.83098: Set connection var ansible_shell_executable to /bin/sh 26264 1727204240.83112: Set connection var ansible_timeout to 10 26264 1727204240.83122: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204240.83147: variable 'ansible_shell_executable' from source: unknown 26264 1727204240.83155: variable 'ansible_connection' from source: unknown 26264 1727204240.83161: variable 'ansible_module_compression' from source: unknown 26264 1727204240.83169: variable 'ansible_shell_type' from source: unknown 26264 1727204240.83177: variable 'ansible_shell_executable' from source: unknown 26264 1727204240.83187: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204240.83194: variable 'ansible_pipelining' from source: unknown 26264 1727204240.83203: variable 'ansible_timeout' from source: unknown 26264 1727204240.83214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204240.83417: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204240.83437: variable 'omit' from source: magic vars 26264 1727204240.83447: starting attempt loop 26264 1727204240.83454: running the handler 26264 1727204240.83476: variable 'ansible_facts' from source: unknown 26264 1727204240.83498: _low_level_execute_command(): starting 26264 1727204240.83516: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204240.84328: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204240.84341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
26264 1727204240.84354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204240.84375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204240.84427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204240.84440: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204240.84457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204240.84483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204240.84500: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204240.84515: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204240.84527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204240.84540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204240.84558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204240.84576: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204240.84591: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204240.84611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204240.84698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204240.84722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204240.84738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204240.85168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 
1727204240.87340: stdout chunk (state=3): >>>/root <<< 26264 1727204240.87580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204240.87584: stdout chunk (state=3): >>><<< 26264 1727204240.87586: stderr chunk (state=3): >>><<< 26264 1727204240.87693: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204240.87697: _low_level_execute_command(): starting 26264 1727204240.87700: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805 `" && echo ansible-tmp-1727204240.8760533-26538-240743178785805="` echo /root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805 `" ) && sleep 0' 26264 1727204240.88256: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 26264 1727204240.88274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204240.88289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204240.88308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204240.88357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204240.88372: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204240.88387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204240.88405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204240.88417: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204240.88428: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204240.88440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204240.88460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204240.88479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204240.88492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204240.88538: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204240.88552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204240.88655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204240.88690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204240.88707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 26264 1727204240.88784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204240.91401: stdout chunk (state=3): >>>ansible-tmp-1727204240.8760533-26538-240743178785805=/root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805 <<< 26264 1727204240.91568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204240.91636: stderr chunk (state=3): >>><<< 26264 1727204240.91639: stdout chunk (state=3): >>><<< 26264 1727204240.91975: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204240.8760533-26538-240743178785805=/root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204240.91979: variable 'ansible_module_compression' from source: unknown 26264 1727204240.91982: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204240.91984: variable 'ansible_facts' from source: unknown 26264 1727204240.92012: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805/AnsiballZ_setup.py 26264 1727204240.92288: Sending initial data 26264 1727204240.92290: Sent initial data (154 bytes) 26264 1727204240.93384: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204240.93399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204240.93414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204240.93434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204240.93490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204240.93502: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204240.93516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204240.93534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204240.93550: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204240.93571: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204240.93587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204240.93603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204240.93619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204240.93634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 
1727204240.93646: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204240.93665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204240.93750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204240.93786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204240.93809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204240.93886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204240.96365: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204240.96426: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204240.96508: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpd5n6gur8 /root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805/AnsiballZ_setup.py <<< 26264 1727204240.96577: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204240.99986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204241.00171: stderr chunk (state=3): >>><<< 26264 1727204241.00174: stdout chunk (state=3): >>><<< 26264 1727204241.00176: done transferring 
module to remote 26264 1727204241.00182: _low_level_execute_command(): starting 26264 1727204241.00185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805/ /root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805/AnsiballZ_setup.py && sleep 0' 26264 1727204241.01373: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204241.01387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204241.01400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204241.01416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204241.01468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204241.01480: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204241.01493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204241.01509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204241.01520: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204241.01529: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204241.01549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204241.01562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204241.01579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204241.01590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204241.01600: stderr chunk (state=3): >>>debug2: match found <<< 26264 
1727204241.01611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204241.01698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204241.01717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204241.01731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204241.01802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204241.04018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204241.04021: stdout chunk (state=3): >>><<< 26264 1727204241.04024: stderr chunk (state=3): >>><<< 26264 1727204241.04117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26264 1727204241.04120: 
_low_level_execute_command(): starting 26264 1727204241.04123: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805/AnsiballZ_setup.py && sleep 0' 26264 1727204241.05198: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204241.05211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204241.05233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204241.05252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204241.05296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204241.05308: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204241.05321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204241.05338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204241.05350: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204241.05362: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204241.05376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204241.05388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204241.05402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204241.05413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204241.05422: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204241.05434: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204241.05511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204241.05532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204241.05549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204241.05633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26264 1727204241.55607: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": 
"UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "21", "epoch": "1727204241", "epoch_int": "1727204241", "date": "2024-09-24", "time": "14:57:21", "iso8601_micro": "2024-09-24T18:57:21.300276Z", "iso8601": "2024-09-24T18:57:21Z", "iso8601_basic": "20240924T145721300276", "iso8601_basic_short": "20240924T145721", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": 
"user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_loadavg": {"1m": 0.47, "5m": 0.37, "15m": 0.19}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2808, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 724, "free": 2808}, "nocache": {"free": 3267, "used": 265}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": 
"ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 587, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264280027136, "block_size": 4096, "block_total": 65519355, "block_available": 64521491, "block_used": 997864, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204241.57339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204241.57342: stdout chunk (state=3): >>><<< 26264 1727204241.57369: stderr chunk (state=3): >>><<< 26264 1727204241.57467: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", 
"ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "21", "epoch": "1727204241", "epoch_int": "1727204241", "date": "2024-09-24", "time": "14:57:21", "iso8601_micro": "2024-09-24T18:57:21.300276Z", "iso8601": "2024-09-24T18:57:21Z", "iso8601_basic": "20240924T145721300276", "iso8601_basic_short": "20240924T145721", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, 
"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], 
"ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_loadavg": {"1m": 0.47, "5m": 0.37, "15m": 0.19}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2808, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 724, "free": 2808}, "nocache": {"free": 3267, "used": 265}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": 
"mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 587, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264280027136, "block_size": 4096, "block_total": 65519355, "block_available": 64521491, "block_used": 997864, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204241.57824: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204241.57854: _low_level_execute_command(): starting 26264 1727204241.57866: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204240.8760533-26538-240743178785805/ > /dev/null 2>&1 && sleep 0' 26264 1727204241.58716: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204241.58729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204241.58742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204241.58767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204241.58815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204241.58827: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204241.58840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 26264 1727204241.58859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204241.58872: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204241.58883: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204241.58898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204241.58917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204241.58932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204241.58943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204241.58957: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204241.58971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204241.59068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204241.59094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204241.59118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204241.59291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204241.61221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204241.61225: stdout chunk (state=3): >>><<< 26264 1727204241.61229: stderr chunk (state=3): >>><<< 26264 1727204241.61270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204241.61274: handler run complete 26264 1727204241.61474: variable 'ansible_facts' from source: unknown 26264 1727204241.61513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204241.61857: variable 'ansible_facts' from source: unknown 26264 1727204241.62074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204241.62327: attempt loop complete, returning result 26264 1727204241.62383: _execute() done 26264 1727204241.62390: dumping result to json 26264 1727204241.62425: done dumping result, returning 26264 1727204241.62586: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-5ff5-08b0-0000000000f0] 26264 1727204241.62692: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000f0 ok: [managed-node3] 26264 1727204241.63545: no more pending results, returning what we have 26264 1727204241.63551: results queue empty 26264 1727204241.63552: checking for any_errors_fatal 26264 1727204241.63554: done checking for any_errors_fatal 26264 1727204241.63554: checking for 
max_fail_percentage 26264 1727204241.63556: done checking for max_fail_percentage 26264 1727204241.63556: checking to see if all hosts have failed and the running result is not ok 26264 1727204241.63558: done checking to see if all hosts have failed 26264 1727204241.63558: getting the remaining hosts for this loop 26264 1727204241.63560: done getting the remaining hosts for this loop 26264 1727204241.63569: getting the next task for host managed-node3 26264 1727204241.63574: done getting next task for host managed-node3 26264 1727204241.63576: ^ task is: TASK: meta (flush_handlers) 26264 1727204241.63578: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204241.63581: getting variables 26264 1727204241.63582: in VariableManager get_vars() 26264 1727204241.63606: Calling all_inventory to load vars for managed-node3 26264 1727204241.63609: Calling groups_inventory to load vars for managed-node3 26264 1727204241.63612: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204241.63622: Calling all_plugins_play to load vars for managed-node3 26264 1727204241.63625: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204241.63629: Calling groups_plugins_play to load vars for managed-node3 26264 1727204241.63805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204241.64099: done with get_vars() 26264 1727204241.64226: done getting variables 26264 1727204241.64265: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000000f0 26264 1727204241.64269: WORKER PROCESS EXITING 26264 1727204241.64316: in VariableManager get_vars() 26264 1727204241.64327: Calling all_inventory to load vars for managed-node3 
26264 1727204241.64404: Calling groups_inventory to load vars for managed-node3 26264 1727204241.64407: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204241.64413: Calling all_plugins_play to load vars for managed-node3 26264 1727204241.64415: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204241.64425: Calling groups_plugins_play to load vars for managed-node3 26264 1727204241.64576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204241.64951: done with get_vars() 26264 1727204241.64966: done queuing things up, now waiting for results queue to drain 26264 1727204241.64968: results queue empty 26264 1727204241.64969: checking for any_errors_fatal 26264 1727204241.64973: done checking for any_errors_fatal 26264 1727204241.64973: checking for max_fail_percentage 26264 1727204241.64974: done checking for max_fail_percentage 26264 1727204241.64975: checking to see if all hosts have failed and the running result is not ok 26264 1727204241.64976: done checking to see if all hosts have failed 26264 1727204241.64976: getting the remaining hosts for this loop 26264 1727204241.64977: done getting the remaining hosts for this loop 26264 1727204241.64980: getting the next task for host managed-node3 26264 1727204241.65096: done getting next task for host managed-node3 26264 1727204241.65099: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 26264 1727204241.65100: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204241.65103: getting variables 26264 1727204241.65104: in VariableManager get_vars() 26264 1727204241.65112: Calling all_inventory to load vars for managed-node3 26264 1727204241.65114: Calling groups_inventory to load vars for managed-node3 26264 1727204241.65116: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204241.65121: Calling all_plugins_play to load vars for managed-node3 26264 1727204241.65123: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204241.65125: Calling groups_plugins_play to load vars for managed-node3 26264 1727204241.65449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204241.65659: done with get_vars() 26264 1727204241.65669: done getting variables 26264 1727204241.65707: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204241.65866: variable 'type' from source: play vars 26264 1727204241.65872: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Tuesday 24 September 2024 14:57:21 -0400 (0:00:00.843) 0:00:05.508 ***** 26264 1727204241.65909: entering _queue_task() for managed-node3/set_fact 26264 1727204241.66418: worker is 1 (out of 1 available) 26264 1727204241.66430: exiting _queue_task() for managed-node3/set_fact 26264 1727204241.66441: done queuing things up, now waiting for results queue to drain 26264 1727204241.66442: waiting for pending results... 
26264 1727204241.67398: running TaskExecutor() for managed-node3/TASK: Set type=veth and interface=lsr27 26264 1727204241.67508: in run() - task 0affcd87-79f5-5ff5-08b0-00000000000f 26264 1727204241.67531: variable 'ansible_search_path' from source: unknown 26264 1727204241.67576: calling self._execute() 26264 1727204241.67770: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204241.67780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204241.67800: variable 'omit' from source: magic vars 26264 1727204241.68189: variable 'ansible_distribution_major_version' from source: facts 26264 1727204241.68207: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204241.68218: variable 'omit' from source: magic vars 26264 1727204241.68261: variable 'omit' from source: magic vars 26264 1727204241.68301: variable 'type' from source: play vars 26264 1727204241.68398: variable 'type' from source: play vars 26264 1727204241.68413: variable 'interface' from source: play vars 26264 1727204241.68490: variable 'interface' from source: play vars 26264 1727204241.68518: variable 'omit' from source: magic vars 26264 1727204241.68571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204241.68623: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204241.68654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204241.68685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204241.68706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204241.68750: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 
1727204241.68759: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204241.68769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204241.69030: Set connection var ansible_pipelining to False 26264 1727204241.69038: Set connection var ansible_connection to ssh 26264 1727204241.69879: Set connection var ansible_shell_type to sh 26264 1727204241.69890: Set connection var ansible_shell_executable to /bin/sh 26264 1727204241.69901: Set connection var ansible_timeout to 10 26264 1727204241.69912: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204241.69939: variable 'ansible_shell_executable' from source: unknown 26264 1727204241.69945: variable 'ansible_connection' from source: unknown 26264 1727204241.69954: variable 'ansible_module_compression' from source: unknown 26264 1727204241.69960: variable 'ansible_shell_type' from source: unknown 26264 1727204241.69968: variable 'ansible_shell_executable' from source: unknown 26264 1727204241.69976: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204241.69988: variable 'ansible_pipelining' from source: unknown 26264 1727204241.69994: variable 'ansible_timeout' from source: unknown 26264 1727204241.70001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204241.70633: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204241.70653: variable 'omit' from source: magic vars 26264 1727204241.70666: starting attempt loop 26264 1727204241.70673: running the handler 26264 1727204241.70694: handler run complete 26264 1727204241.70707: attempt loop complete, returning result 26264 1727204241.70712: _execute() done 26264 
1727204241.70718: dumping result to json 26264 1727204241.70743: done dumping result, returning 26264 1727204241.70756: done running TaskExecutor() for managed-node3/TASK: Set type=veth and interface=lsr27 [0affcd87-79f5-5ff5-08b0-00000000000f] 26264 1727204241.70772: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000000f ok: [managed-node3] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 26264 1727204241.71300: no more pending results, returning what we have 26264 1727204241.71304: results queue empty 26264 1727204241.71305: checking for any_errors_fatal 26264 1727204241.71306: done checking for any_errors_fatal 26264 1727204241.71307: checking for max_fail_percentage 26264 1727204241.71308: done checking for max_fail_percentage 26264 1727204241.71309: checking to see if all hosts have failed and the running result is not ok 26264 1727204241.71310: done checking to see if all hosts have failed 26264 1727204241.71311: getting the remaining hosts for this loop 26264 1727204241.71313: done getting the remaining hosts for this loop 26264 1727204241.71318: getting the next task for host managed-node3 26264 1727204241.71323: done getting next task for host managed-node3 26264 1727204241.71326: ^ task is: TASK: Include the task 'show_interfaces.yml' 26264 1727204241.71328: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204241.71331: getting variables 26264 1727204241.71333: in VariableManager get_vars() 26264 1727204241.71359: Calling all_inventory to load vars for managed-node3 26264 1727204241.71362: Calling groups_inventory to load vars for managed-node3 26264 1727204241.71372: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204241.71383: Calling all_plugins_play to load vars for managed-node3 26264 1727204241.71385: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204241.71388: Calling groups_plugins_play to load vars for managed-node3 26264 1727204241.71566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204241.71776: done with get_vars() 26264 1727204241.71786: done getting variables 26264 1727204241.71954: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000000f 26264 1727204241.71957: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Tuesday 24 September 2024 14:57:21 -0400 (0:00:00.060) 0:00:05.569 ***** 26264 1727204241.72013: entering _queue_task() for managed-node3/include_tasks 26264 1727204241.72468: worker is 1 (out of 1 available) 26264 1727204241.72478: exiting _queue_task() for managed-node3/include_tasks 26264 1727204241.72490: done queuing things up, now waiting for results queue to drain 26264 1727204241.72492: waiting for pending results... 
26264 1727204241.72800: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 26264 1727204241.72917: in run() - task 0affcd87-79f5-5ff5-08b0-000000000010 26264 1727204241.72933: variable 'ansible_search_path' from source: unknown 26264 1727204241.73014: calling self._execute() 26264 1727204241.73118: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204241.73129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204241.73142: variable 'omit' from source: magic vars 26264 1727204241.73534: variable 'ansible_distribution_major_version' from source: facts 26264 1727204241.73554: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204241.73566: _execute() done 26264 1727204241.73574: dumping result to json 26264 1727204241.73581: done dumping result, returning 26264 1727204241.73590: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-5ff5-08b0-000000000010] 26264 1727204241.73599: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000010 26264 1727204241.73706: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000010 26264 1727204241.73738: no more pending results, returning what we have 26264 1727204241.73743: in VariableManager get_vars() 26264 1727204241.73784: Calling all_inventory to load vars for managed-node3 26264 1727204241.73787: Calling groups_inventory to load vars for managed-node3 26264 1727204241.73791: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204241.73806: Calling all_plugins_play to load vars for managed-node3 26264 1727204241.73810: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204241.73813: Calling groups_plugins_play to load vars for managed-node3 26264 1727204241.73977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 
1727204241.74205: done with get_vars() 26264 1727204241.74212: variable 'ansible_search_path' from source: unknown 26264 1727204241.74229: we have included files to process 26264 1727204241.74230: generating all_blocks data 26264 1727204241.74232: done generating all_blocks data 26264 1727204241.74233: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 26264 1727204241.74234: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 26264 1727204241.74237: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 26264 1727204241.74554: in VariableManager get_vars() 26264 1727204241.74572: done with get_vars() 26264 1727204241.74751: WORKER PROCESS EXITING 26264 1727204241.74859: done processing included file 26264 1727204241.74861: iterating over new_blocks loaded from include file 26264 1727204241.74863: in VariableManager get_vars() 26264 1727204241.74876: done with get_vars() 26264 1727204241.74878: filtering new block on tags 26264 1727204241.74895: done filtering new block on tags 26264 1727204241.74897: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 26264 1727204241.74902: extending task lists for all hosts with included blocks 26264 1727204241.74988: done extending task lists 26264 1727204241.74989: done processing included files 26264 1727204241.74990: results queue empty 26264 1727204241.74991: checking for any_errors_fatal 26264 1727204241.74994: done checking for any_errors_fatal 26264 1727204241.74995: checking for max_fail_percentage 26264 1727204241.74996: done checking for max_fail_percentage 26264 1727204241.74996: checking to see if all hosts have 
failed and the running result is not ok 26264 1727204241.74997: done checking to see if all hosts have failed 26264 1727204241.74998: getting the remaining hosts for this loop 26264 1727204241.74999: done getting the remaining hosts for this loop 26264 1727204241.75001: getting the next task for host managed-node3 26264 1727204241.75005: done getting next task for host managed-node3 26264 1727204241.75007: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 26264 1727204241.75009: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204241.75011: getting variables 26264 1727204241.75012: in VariableManager get_vars() 26264 1727204241.75019: Calling all_inventory to load vars for managed-node3 26264 1727204241.75021: Calling groups_inventory to load vars for managed-node3 26264 1727204241.75023: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204241.75028: Calling all_plugins_play to load vars for managed-node3 26264 1727204241.75030: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204241.75033: Calling groups_plugins_play to load vars for managed-node3 26264 1727204241.75173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204241.75351: done with get_vars() 26264 1727204241.75470: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Tuesday 24 September 2024 14:57:21 -0400 (0:00:00.035) 0:00:05.604 *****
26264 1727204241.75541: entering _queue_task() for managed-node3/include_tasks 26264 1727204241.76125: worker is 1 (out of 1 available) 26264 1727204241.76138: exiting _queue_task() for managed-node3/include_tasks 26264 1727204241.76154: done queuing things up, now waiting for results queue to drain 26264 1727204241.76156: waiting for pending results... 
26264 1727204241.76954: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 26264 1727204241.77051: in run() - task 0affcd87-79f5-5ff5-08b0-000000000104 26264 1727204241.77071: variable 'ansible_search_path' from source: unknown 26264 1727204241.77079: variable 'ansible_search_path' from source: unknown 26264 1727204241.77124: calling self._execute() 26264 1727204241.77210: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204241.77221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204241.77232: variable 'omit' from source: magic vars 26264 1727204241.77684: variable 'ansible_distribution_major_version' from source: facts 26264 1727204241.77701: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204241.77712: _execute() done 26264 1727204241.77720: dumping result to json 26264 1727204241.77728: done dumping result, returning 26264 1727204241.77742: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-5ff5-08b0-000000000104] 26264 1727204241.77757: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000104 26264 1727204241.77877: no more pending results, returning what we have 26264 1727204241.77882: in VariableManager get_vars() 26264 1727204241.77913: Calling all_inventory to load vars for managed-node3 26264 1727204241.77916: Calling groups_inventory to load vars for managed-node3 26264 1727204241.77919: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204241.77933: Calling all_plugins_play to load vars for managed-node3 26264 1727204241.77936: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204241.77940: Calling groups_plugins_play to load vars for managed-node3 26264 1727204241.78171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 
1727204241.78358: done with get_vars() 26264 1727204241.78368: variable 'ansible_search_path' from source: unknown 26264 1727204241.78369: variable 'ansible_search_path' from source: unknown 26264 1727204241.78414: we have included files to process 26264 1727204241.78416: generating all_blocks data 26264 1727204241.78417: done generating all_blocks data 26264 1727204241.78418: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 26264 1727204241.78420: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 26264 1727204241.78422: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 26264 1727204241.78821: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000104 26264 1727204241.78825: WORKER PROCESS EXITING 26264 1727204241.79030: done processing included file 26264 1727204241.79032: iterating over new_blocks loaded from include file 26264 1727204241.79034: in VariableManager get_vars() 26264 1727204241.79045: done with get_vars() 26264 1727204241.79049: filtering new block on tags 26264 1727204241.79068: done filtering new block on tags 26264 1727204241.79070: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 26264 1727204241.79075: extending task lists for all hosts with included blocks 26264 1727204241.79179: done extending task lists 26264 1727204241.79181: done processing included files 26264 1727204241.79182: results queue empty 26264 1727204241.79182: checking for any_errors_fatal 26264 1727204241.79185: done checking for any_errors_fatal 26264 1727204241.79186: checking for max_fail_percentage 26264 
1727204241.79187: done checking for max_fail_percentage 26264 1727204241.79187: checking to see if all hosts have failed and the running result is not ok 26264 1727204241.79188: done checking to see if all hosts have failed 26264 1727204241.79189: getting the remaining hosts for this loop 26264 1727204241.79190: done getting the remaining hosts for this loop 26264 1727204241.79193: getting the next task for host managed-node3 26264 1727204241.79197: done getting next task for host managed-node3 26264 1727204241.79199: ^ task is: TASK: Gather current interface info 26264 1727204241.79202: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204241.79204: getting variables 26264 1727204241.79205: in VariableManager get_vars() 26264 1727204241.79213: Calling all_inventory to load vars for managed-node3 26264 1727204241.79215: Calling groups_inventory to load vars for managed-node3 26264 1727204241.79217: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204241.79221: Calling all_plugins_play to load vars for managed-node3 26264 1727204241.79224: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204241.79228: Calling groups_plugins_play to load vars for managed-node3 26264 1727204241.79379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204241.79600: done with get_vars() 26264 1727204241.79608: done getting variables 26264 1727204241.79635: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Tuesday 24 September 2024 14:57:21 -0400 (0:00:00.041) 0:00:05.645 *****
26264 1727204241.79657: entering _queue_task() for managed-node3/command 26264 1727204241.79844: worker is 1 (out of 1 available) 26264 1727204241.79860: exiting _queue_task() for managed-node3/command 26264 1727204241.79873: done queuing things up, now waiting for results queue to drain 26264 1727204241.79875: waiting for pending results... 
26264 1727204241.80013: running TaskExecutor() for managed-node3/TASK: Gather current interface info 26264 1727204241.80081: in run() - task 0affcd87-79f5-5ff5-08b0-000000000115 26264 1727204241.80091: variable 'ansible_search_path' from source: unknown 26264 1727204241.80095: variable 'ansible_search_path' from source: unknown 26264 1727204241.80125: calling self._execute() 26264 1727204241.80185: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204241.80188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204241.80196: variable 'omit' from source: magic vars 26264 1727204241.80457: variable 'ansible_distribution_major_version' from source: facts 26264 1727204241.80471: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204241.80477: variable 'omit' from source: magic vars 26264 1727204241.80509: variable 'omit' from source: magic vars 26264 1727204241.80535: variable 'omit' from source: magic vars 26264 1727204241.80573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204241.80599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204241.80614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204241.80628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204241.80639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204241.80667: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204241.80670: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204241.80675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 
1727204241.80737: Set connection var ansible_pipelining to False 26264 1727204241.80740: Set connection var ansible_connection to ssh 26264 1727204241.80743: Set connection var ansible_shell_type to sh 26264 1727204241.80751: Set connection var ansible_shell_executable to /bin/sh 26264 1727204241.80756: Set connection var ansible_timeout to 10 26264 1727204241.80766: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204241.80784: variable 'ansible_shell_executable' from source: unknown 26264 1727204241.80787: variable 'ansible_connection' from source: unknown 26264 1727204241.80790: variable 'ansible_module_compression' from source: unknown 26264 1727204241.80792: variable 'ansible_shell_type' from source: unknown 26264 1727204241.80795: variable 'ansible_shell_executable' from source: unknown 26264 1727204241.80797: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204241.80799: variable 'ansible_pipelining' from source: unknown 26264 1727204241.80801: variable 'ansible_timeout' from source: unknown 26264 1727204241.80806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204241.80912: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204241.80919: variable 'omit' from source: magic vars 26264 1727204241.80925: starting attempt loop 26264 1727204241.80928: running the handler 26264 1727204241.80939: _low_level_execute_command(): starting 26264 1727204241.80946: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204241.81612: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204241.81655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204241.81676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204241.81694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204241.81772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204241.83390: stdout chunk (state=3): >>>/root <<< 26264 1727204241.83532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204241.83541: stderr chunk (state=3): >>><<< 26264 1727204241.83545: stdout chunk (state=3): >>><<< 26264 1727204241.83562: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204241.83623: _low_level_execute_command(): starting 26264 1727204241.83628: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059 `" && echo ansible-tmp-1727204241.835624-26665-165531558523059="` echo /root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059 `" ) && sleep 0' 26264 1727204241.84011: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204241.84023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204241.84042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204241.84062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204241.84127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204241.84138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204241.84197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204241.86001: stdout chunk (state=3): >>>ansible-tmp-1727204241.835624-26665-165531558523059=/root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059 <<< 26264 1727204241.86113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204241.86156: stderr chunk (state=3): >>><<< 26264 1727204241.86160: stdout chunk (state=3): >>><<< 26264 1727204241.86178: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204241.835624-26665-165531558523059=/root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204241.86204: variable 'ansible_module_compression' from source: unknown 26264 1727204241.86243: ANSIBALLZ: Using generic lock for ansible.legacy.command 26264 1727204241.86246: ANSIBALLZ: Acquiring lock 26264 1727204241.86252: ANSIBALLZ: Lock acquired: 139841028923536 26264 1727204241.86254: ANSIBALLZ: Creating module 26264 1727204241.96963: ANSIBALLZ: Writing module into payload 26264 1727204241.97078: ANSIBALLZ: Writing module 26264 1727204241.97105: ANSIBALLZ: Renaming module 26264 1727204241.97114: ANSIBALLZ: Done creating module 26264 1727204241.97136: variable 'ansible_facts' from source: unknown 26264 1727204241.97214: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059/AnsiballZ_command.py 26264 1727204241.97389: Sending initial data 26264 1727204241.97393: Sent initial data (155 bytes) 26264 1727204241.98381: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204241.98397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204241.98412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204241.98432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204241.98479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204241.98492: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204241.98507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204241.98525: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204241.98538: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204241.98556: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204241.98572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204241.98586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204241.98604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204241.98617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204241.98629: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204241.98643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204241.98726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204241.98754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204241.98759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204241.98802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204242.00591: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 
<<< 26264 1727204242.00635: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204242.00689: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp5sznrabj /root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059/AnsiballZ_command.py <<< 26264 1727204242.00716: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204242.01861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204242.02083: stderr chunk (state=3): >>><<< 26264 1727204242.02086: stdout chunk (state=3): >>><<< 26264 1727204242.02088: done transferring module to remote 26264 1727204242.02090: _low_level_execute_command(): starting 26264 1727204242.02092: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059/ /root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059/AnsiballZ_command.py && sleep 0' 26264 1727204242.03003: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204242.03013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.03022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.03036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.03079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.03086: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204242.03099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.03113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 26264 1727204242.03121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204242.03133: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204242.03145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.03170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.03187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.03200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.03215: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204242.03223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.03302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204242.03320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204242.03337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204242.03411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204242.05182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204242.05268: stderr chunk (state=3): >>><<< 26264 1727204242.05272: stdout chunk (state=3): >>><<< 26264 1727204242.05376: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204242.05380: _low_level_execute_command(): starting 26264 1727204242.05383: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059/AnsiballZ_command.py && sleep 0' 26264 1727204242.05986: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204242.06001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.06017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.06046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.06092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.06104: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204242.06162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.06182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204242.06194: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 
26264 1727204242.06205: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204242.06218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.06232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.06251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.06274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.06286: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204242.06300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.06384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204242.06400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204242.06414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204242.07482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204242.20007: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:22.195550", "end": "2024-09-24 14:57:22.198860", "delta": "0:00:00.003310", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26264 1727204242.21408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204242.21413: stdout chunk (state=3): >>><<< 26264 1727204242.21415: stderr chunk (state=3): >>><<< 26264 1727204242.21566: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:22.195550", "end": "2024-09-24 14:57:22.198860", "delta": "0:00:00.003310", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
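The module result above comes back as a single JSON document on stdout; the controller parses it and derives `stdout_lines` by splitting `stdout` on newlines, which is where the later `current_interfaces` list comes from. A sketch of that post-processing, using the exact (abridged) JSON shown above:

```python
import json

# The raw module result as printed on stdout above, abridged to the fields
# used here. Ansible parses this JSON and adds stdout_lines/stderr_lines
# by splitting on newlines.
raw = ('{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", '
      '"stderr": "", "rc": 0, "cmd": ["ls", "-1"]}')
result = json.loads(raw)
result["stdout_lines"] = result["stdout"].splitlines()
```

With the stdout shown in the log, `stdout_lines` becomes the three interface entries listed under `/sys/class/net`.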
26264 1727204242.21570: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204242.21578: _low_level_execute_command(): starting 26264 1727204242.21581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204241.835624-26665-165531558523059/ > /dev/null 2>&1 && sleep 0' 26264 1727204242.22600: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.22603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.22637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.22641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.22643: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.22718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204242.22736: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204242.22807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204242.24589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204242.24702: stderr chunk (state=3): >>><<< 26264 1727204242.24709: stdout chunk (state=3): >>><<< 26264 1727204242.24995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0
26264 1727204242.24998: handler run complete
26264 1727204242.25001: Evaluated conditional (False): False
26264 1727204242.25003: attempt loop complete, returning result
26264 1727204242.25005: _execute() done
26264 1727204242.25007: dumping result to json
26264 1727204242.25009: done dumping result, returning
26264 1727204242.25011: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [0affcd87-79f5-5ff5-08b0-000000000115]
26264 1727204242.25013: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000115
26264 1727204242.25114: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000115
26264 1727204242.25118: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003310",
    "end": "2024-09-24 14:57:22.198860",
    "rc": 0,
    "start": "2024-09-24 14:57:22.195550"
}

STDOUT:

bonding_masters
eth0
lo

26264 1727204242.25201: no more pending results, returning what we have
26264 1727204242.25205: results queue empty
26264 1727204242.25206: checking for any_errors_fatal
26264 1727204242.25207: done checking for any_errors_fatal
26264 1727204242.25208: checking for max_fail_percentage
26264 1727204242.25210: done checking for max_fail_percentage
26264 1727204242.25211: checking to see if all hosts have failed and the running result is not ok
26264 1727204242.25212: done checking to see if all hosts have failed
26264 1727204242.25213: getting the remaining hosts for this loop
26264 1727204242.25215: done getting the remaining hosts for this loop
26264 1727204242.25220: getting the next task for host managed-node3
26264 1727204242.25229: done getting next task for host managed-node3
26264 1727204242.25232: ^ task is: TASK: Set current_interfaces
26264 1727204242.25236: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204242.25240: getting variables 26264 1727204242.25242: in VariableManager get_vars() 26264 1727204242.25275: Calling all_inventory to load vars for managed-node3 26264 1727204242.25279: Calling groups_inventory to load vars for managed-node3 26264 1727204242.25283: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.25296: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.25298: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.25302: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.25729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.25947: done with get_vars() 26264 1727204242.25959: done getting variables 26264 1727204242.26048: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.464) 0:00:06.110 ***** 26264 1727204242.26088: entering _queue_task() for managed-node3/set_fact 26264 1727204242.26390: worker is 1 (out of 1 available) 26264 1727204242.26402: exiting _queue_task() for managed-node3/set_fact 26264 1727204242.26414: done queuing things up, now waiting for results queue to drain 26264 1727204242.26416: waiting for pending results... 26264 1727204242.26696: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 26264 1727204242.26827: in run() - task 0affcd87-79f5-5ff5-08b0-000000000116 26264 1727204242.26844: variable 'ansible_search_path' from source: unknown 26264 1727204242.26854: variable 'ansible_search_path' from source: unknown 26264 1727204242.26934: calling self._execute() 26264 1727204242.27092: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.27104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.27117: variable 'omit' from source: magic vars 26264 1727204242.27499: variable 'ansible_distribution_major_version' from source: facts 26264 1727204242.27516: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204242.27526: variable 'omit' from source: magic vars 26264 1727204242.27583: variable 'omit' from source: magic vars 26264 1727204242.27709: variable '_current_interfaces' from source: set_fact 26264 1727204242.27774: variable 'omit' from source: magic vars 26264 1727204242.27822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204242.27943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204242.27973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 
1727204242.27995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204242.28017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204242.28056: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204242.28066: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.28074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.28181: Set connection var ansible_pipelining to False 26264 1727204242.28188: Set connection var ansible_connection to ssh 26264 1727204242.28193: Set connection var ansible_shell_type to sh 26264 1727204242.28201: Set connection var ansible_shell_executable to /bin/sh 26264 1727204242.28209: Set connection var ansible_timeout to 10 26264 1727204242.28219: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204242.28251: variable 'ansible_shell_executable' from source: unknown 26264 1727204242.28258: variable 'ansible_connection' from source: unknown 26264 1727204242.28266: variable 'ansible_module_compression' from source: unknown 26264 1727204242.28272: variable 'ansible_shell_type' from source: unknown 26264 1727204242.28277: variable 'ansible_shell_executable' from source: unknown 26264 1727204242.28282: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.28288: variable 'ansible_pipelining' from source: unknown 26264 1727204242.28293: variable 'ansible_timeout' from source: unknown 26264 1727204242.28298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.28439: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
26264 1727204242.28462: variable 'omit' from source: magic vars
26264 1727204242.28476: starting attempt loop
26264 1727204242.28483: running the handler
26264 1727204242.28495: handler run complete
26264 1727204242.28508: attempt loop complete, returning result
26264 1727204242.28513: _execute() done
26264 1727204242.28519: dumping result to json
26264 1727204242.28525: done dumping result, returning
26264 1727204242.28534: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [0affcd87-79f5-5ff5-08b0-000000000116]
26264 1727204242.28542: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000116
26264 1727204242.28659: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000116
ok: [managed-node3] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
26264 1727204242.28723: no more pending results, returning what we have
26264 1727204242.28727: results queue empty
26264 1727204242.28728: checking for any_errors_fatal
26264 1727204242.28734: done checking for any_errors_fatal
26264 1727204242.28734: checking for max_fail_percentage
26264 1727204242.28736: done checking for max_fail_percentage
26264 1727204242.28737: checking to see if all hosts have failed and the running result is not ok
26264 1727204242.28738: done checking to see if all hosts have failed
26264 1727204242.28739: getting the remaining hosts for this loop
26264 1727204242.28741: done getting the remaining hosts for this loop
26264 1727204242.28746: getting the next task for host managed-node3
26264 1727204242.28757: done getting next task for host managed-node3
26264 1727204242.28760: ^ task is: TASK: Show current_interfaces
26264 1727204242.28763: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1,
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204242.28770: getting variables 26264 1727204242.28772: in VariableManager get_vars() 26264 1727204242.28802: Calling all_inventory to load vars for managed-node3 26264 1727204242.28805: Calling groups_inventory to load vars for managed-node3 26264 1727204242.28809: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.28820: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.28822: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.28826: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.29058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.29273: done with get_vars() 26264 1727204242.29283: done getting variables 26264 1727204242.29381: WORKER PROCESS EXITING 26264 1727204242.29417: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.033) 0:00:06.143 ***** 26264 1727204242.29453: entering 
_queue_task() for managed-node3/debug 26264 1727204242.29885: worker is 1 (out of 1 available) 26264 1727204242.29897: exiting _queue_task() for managed-node3/debug 26264 1727204242.29908: done queuing things up, now waiting for results queue to drain 26264 1727204242.29910: waiting for pending results... 26264 1727204242.30160: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 26264 1727204242.30262: in run() - task 0affcd87-79f5-5ff5-08b0-000000000105 26264 1727204242.30284: variable 'ansible_search_path' from source: unknown 26264 1727204242.30291: variable 'ansible_search_path' from source: unknown 26264 1727204242.30328: calling self._execute() 26264 1727204242.30503: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.30513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.30526: variable 'omit' from source: magic vars 26264 1727204242.31110: variable 'ansible_distribution_major_version' from source: facts 26264 1727204242.31130: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204242.31139: variable 'omit' from source: magic vars 26264 1727204242.31179: variable 'omit' from source: magic vars 26264 1727204242.31437: variable 'current_interfaces' from source: set_fact 26264 1727204242.31540: variable 'omit' from source: magic vars 26264 1727204242.31594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204242.31709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204242.31771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204242.31878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204242.31903: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204242.32019: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204242.32030: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.32079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.32307: Set connection var ansible_pipelining to False 26264 1727204242.32316: Set connection var ansible_connection to ssh 26264 1727204242.32328: Set connection var ansible_shell_type to sh 26264 1727204242.32339: Set connection var ansible_shell_executable to /bin/sh 26264 1727204242.32355: Set connection var ansible_timeout to 10 26264 1727204242.32406: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204242.32461: variable 'ansible_shell_executable' from source: unknown 26264 1727204242.32514: variable 'ansible_connection' from source: unknown 26264 1727204242.32523: variable 'ansible_module_compression' from source: unknown 26264 1727204242.32530: variable 'ansible_shell_type' from source: unknown 26264 1727204242.32541: variable 'ansible_shell_executable' from source: unknown 26264 1727204242.32552: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.32624: variable 'ansible_pipelining' from source: unknown 26264 1727204242.32631: variable 'ansible_timeout' from source: unknown 26264 1727204242.32638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.32915: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204242.32950: variable 'omit' from source: magic vars 26264 1727204242.32962: starting attempt 
loop
26264 1727204242.32980: running the handler
26264 1727204242.33027: handler run complete
26264 1727204242.33062: attempt loop complete, returning result
26264 1727204242.33078: _execute() done
26264 1727204242.33085: dumping result to json
26264 1727204242.33093: done dumping result, returning
26264 1727204242.33104: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [0affcd87-79f5-5ff5-08b0-000000000105]
26264 1727204242.33114: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000105
26264 1727204242.33228: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000105
26264 1727204242.33236: WORKER PROCESS EXITING
ok: [managed-node3] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']

26264 1727204242.33302: no more pending results, returning what we have
26264 1727204242.33306: results queue empty
26264 1727204242.33307: checking for any_errors_fatal
26264 1727204242.33312: done checking for any_errors_fatal
26264 1727204242.33313: checking for max_fail_percentage
26264 1727204242.33315: done checking for max_fail_percentage
26264 1727204242.33315: checking to see if all hosts have failed and the running result is not ok
26264 1727204242.33317: done checking to see if all hosts have failed
26264 1727204242.33317: getting the remaining hosts for this loop
26264 1727204242.33319: done getting the remaining hosts for this loop
26264 1727204242.33325: getting the next task for host managed-node3
26264 1727204242.33335: done getting next task for host managed-node3
26264 1727204242.33339: ^ task is: TASK: Include the task 'manage_test_interface.yml'
26264 1727204242.33341: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 26264 1727204242.33344: getting variables 26264 1727204242.33346: in VariableManager get_vars() 26264 1727204242.33382: Calling all_inventory to load vars for managed-node3 26264 1727204242.33385: Calling groups_inventory to load vars for managed-node3 26264 1727204242.33390: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.33401: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.33404: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.33406: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.33599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.33793: done with get_vars() 26264 1727204242.33804: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.044) 0:00:06.188 ***** 26264 1727204242.33897: entering _queue_task() for managed-node3/include_tasks 26264 1727204242.34321: worker is 1 (out of 1 available) 26264 1727204242.34340: exiting _queue_task() for managed-node3/include_tasks 26264 1727204242.34355: done queuing things up, now waiting for results queue to drain 26264 1727204242.34357: waiting for pending results... 
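Every task in this run logs the same gate before executing: `Evaluated conditional (ansible_distribution_major_version != '6')`. Inside Ansible this goes through Jinja2 templating; a simplified stand-in for the check is sketched below. The fact value `"9"` is hypothetical for illustration — the log only shows that the conditional evaluated True, not the actual major version:

```python
# Simplified stand-in for the repeated 'when' check logged above.
# The real evaluation is Jinja2 templating inside Ansible, not this function.
def should_run(facts, condition_var="ansible_distribution_major_version"):
    """Mirror of the condition: ansible_distribution_major_version != '6'."""
    return facts.get(condition_var) != "6"

# Hypothetical fact value; the log does not state the distro version.
facts = {"ansible_distribution_major_version": "9"}
run_task = should_run(facts)   # True -> the task proceeds, as in the log
```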
26264 1727204242.34653: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 26264 1727204242.34775: in run() - task 0affcd87-79f5-5ff5-08b0-000000000011 26264 1727204242.34800: variable 'ansible_search_path' from source: unknown 26264 1727204242.34842: calling self._execute() 26264 1727204242.34951: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.34962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.34977: variable 'omit' from source: magic vars 26264 1727204242.35537: variable 'ansible_distribution_major_version' from source: facts 26264 1727204242.35622: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204242.35635: _execute() done 26264 1727204242.35675: dumping result to json 26264 1727204242.35684: done dumping result, returning 26264 1727204242.35694: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-5ff5-08b0-000000000011] 26264 1727204242.35719: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000011 26264 1727204242.35892: no more pending results, returning what we have 26264 1727204242.35898: in VariableManager get_vars() 26264 1727204242.35934: Calling all_inventory to load vars for managed-node3 26264 1727204242.35938: Calling groups_inventory to load vars for managed-node3 26264 1727204242.35942: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.35960: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.35965: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.35969: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.36277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.36483: done with get_vars() 26264 1727204242.36491: variable 'ansible_search_path' 
from source: unknown 26264 1727204242.36507: we have included files to process 26264 1727204242.36508: generating all_blocks data 26264 1727204242.36510: done generating all_blocks data 26264 1727204242.36515: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 26264 1727204242.36517: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 26264 1727204242.36520: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 26264 1727204242.37123: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000011 26264 1727204242.37127: WORKER PROCESS EXITING 26264 1727204242.37533: in VariableManager get_vars() 26264 1727204242.37557: done with get_vars() 26264 1727204242.37842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 26264 1727204242.38694: done processing included file 26264 1727204242.38696: iterating over new_blocks loaded from include file 26264 1727204242.38698: in VariableManager get_vars() 26264 1727204242.38712: done with get_vars() 26264 1727204242.38713: filtering new block on tags 26264 1727204242.38747: done filtering new block on tags 26264 1727204242.38750: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 26264 1727204242.38760: extending task lists for all hosts with included blocks 26264 1727204242.38936: done extending task lists 26264 1727204242.38938: done processing included files 26264 1727204242.38939: results queue empty 26264 1727204242.38939: checking for any_errors_fatal 26264 1727204242.38943: done checking for any_errors_fatal 26264 1727204242.38944: checking for max_fail_percentage 
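The records above ("filtering new block on tags", "extending task lists for all hosts with included blocks") describe how `include_tasks` works: the included file's blocks are loaded, filtered against any run tags, and appended to the host's pending task list. A heavily simplified model of that bookkeeping — the real implementation lives in Ansible's strategy/iterator code and operates on Block objects, not strings:

```python
# Simplified model of 'extending task lists' for an include_tasks result.
# Illustrative only; real Ansible appends Block objects after tag filtering.
def extend_task_list(task_list, included_blocks, run_tags=None):
    for block in included_blocks:
        tags = block.get("tags", [])
        # filtering new block on tags: skip blocks whose tags miss run_tags
        if run_tags and not set(tags) & set(run_tags):
            continue
        task_list.append(block["name"])   # done extending task lists
    return task_list

# Task names taken from this log; the tags field is illustrative.
tasks = ["Gather current interface info", "Set current_interfaces"]
included = [{"name": 'Ensure state in ["present", "absent"]', "tags": []}]
extend_task_list(tasks, included)
```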
26264 1727204242.38945: done checking for max_fail_percentage 26264 1727204242.38946: checking to see if all hosts have failed and the running result is not ok 26264 1727204242.38947: done checking to see if all hosts have failed 26264 1727204242.38947: getting the remaining hosts for this loop 26264 1727204242.38949: done getting the remaining hosts for this loop 26264 1727204242.38951: getting the next task for host managed-node3 26264 1727204242.38955: done getting next task for host managed-node3 26264 1727204242.38957: ^ task is: TASK: Ensure state in ["present", "absent"] 26264 1727204242.38959: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204242.38962: getting variables 26264 1727204242.38963: in VariableManager get_vars() 26264 1727204242.38977: Calling all_inventory to load vars for managed-node3 26264 1727204242.38979: Calling groups_inventory to load vars for managed-node3 26264 1727204242.38981: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.38987: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.38989: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.38992: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.39139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.39338: done with get_vars() 26264 1727204242.39348: done getting variables 26264 1727204242.39419: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.055) 0:00:06.243 ***** 26264 1727204242.39445: entering _queue_task() for managed-node3/fail 26264 1727204242.39446: Creating lock for fail 26264 1727204242.39772: worker is 1 (out of 1 available) 26264 1727204242.39785: exiting _queue_task() for managed-node3/fail 26264 1727204242.39798: done queuing things up, now waiting for results queue to drain 26264 1727204242.39800: waiting for pending results... 
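The task queued next, `Ensure state in ["present", "absent"]`, is a guard: a `fail` task whose `when` condition is `state not in ["present", "absent"]`, so it only fires for an invalid include parameter and is skipped otherwise (as the following records show). A plain-Python sketch of that validation pattern — the actual task uses the `fail` module with a `when` condition, not this function:

```python
# Sketch of the guard the fail task implements: reject any 'state' include
# parameter outside the two allowed values, otherwise do nothing (skip).
def ensure_state_valid(state):
    if state not in ["present", "absent"]:
        # corresponds to the fail task firing
        raise ValueError(f"Invalid state: {state!r}")
    # corresponds to: when evaluation is False, skipping this task
    return "skipped"

result = ensure_state_valid("present")
```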
26264 1727204242.40357: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 26264 1727204242.40516: in run() - task 0affcd87-79f5-5ff5-08b0-000000000131 26264 1727204242.40536: variable 'ansible_search_path' from source: unknown 26264 1727204242.40551: variable 'ansible_search_path' from source: unknown 26264 1727204242.40598: calling self._execute() 26264 1727204242.40695: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.40707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.40721: variable 'omit' from source: magic vars 26264 1727204242.41161: variable 'ansible_distribution_major_version' from source: facts 26264 1727204242.41204: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204242.41407: variable 'state' from source: include params 26264 1727204242.41430: Evaluated conditional (state not in ["present", "absent"]): False 26264 1727204242.41445: when evaluation is False, skipping this task 26264 1727204242.41456: _execute() done 26264 1727204242.41465: dumping result to json 26264 1727204242.41474: done dumping result, returning 26264 1727204242.41484: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-5ff5-08b0-000000000131] 26264 1727204242.41496: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000131 skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 26264 1727204242.41665: no more pending results, returning what we have 26264 1727204242.41670: results queue empty 26264 1727204242.41671: checking for any_errors_fatal 26264 1727204242.41673: done checking for any_errors_fatal 26264 1727204242.41674: checking for max_fail_percentage 26264 1727204242.41675: done checking for max_fail_percentage 26264 1727204242.41676: checking to see if all hosts 
have failed and the running result is not ok 26264 1727204242.41677: done checking to see if all hosts have failed 26264 1727204242.41678: getting the remaining hosts for this loop 26264 1727204242.41679: done getting the remaining hosts for this loop 26264 1727204242.41683: getting the next task for host managed-node3 26264 1727204242.41693: done getting next task for host managed-node3 26264 1727204242.41696: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 26264 1727204242.41700: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204242.41704: getting variables 26264 1727204242.41706: in VariableManager get_vars() 26264 1727204242.41736: Calling all_inventory to load vars for managed-node3 26264 1727204242.41740: Calling groups_inventory to load vars for managed-node3 26264 1727204242.41744: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.41761: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.41766: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.41770: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.42003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.42218: done with get_vars() 26264 1727204242.42229: done getting variables 26264 1727204242.42405: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000131 26264 1727204242.42409: WORKER PROCESS EXITING 26264 1727204242.42441: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.030) 0:00:06.273 ***** 26264 1727204242.42480: entering _queue_task() for managed-node3/fail 26264 1727204242.42941: worker is 1 (out of 1 available) 26264 1727204242.42956: exiting _queue_task() for managed-node3/fail 26264 1727204242.42969: done queuing things up, now waiting for results queue to drain 26264 1727204242.42971: waiting for pending results... 
26264 1727204242.43119: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 26264 1727204242.43199: in run() - task 0affcd87-79f5-5ff5-08b0-000000000132 26264 1727204242.43210: variable 'ansible_search_path' from source: unknown 26264 1727204242.43214: variable 'ansible_search_path' from source: unknown 26264 1727204242.43243: calling self._execute() 26264 1727204242.43302: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.43306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.43316: variable 'omit' from source: magic vars 26264 1727204242.43662: variable 'ansible_distribution_major_version' from source: facts 26264 1727204242.43687: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204242.43840: variable 'type' from source: set_fact 26264 1727204242.43853: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 26264 1727204242.43860: when evaluation is False, skipping this task 26264 1727204242.43867: _execute() done 26264 1727204242.43873: dumping result to json 26264 1727204242.43879: done dumping result, returning 26264 1727204242.43899: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-5ff5-08b0-000000000132] 26264 1727204242.43912: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000132 skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 26264 1727204242.44069: no more pending results, returning what we have 26264 1727204242.44073: results queue empty 26264 1727204242.44074: checking for any_errors_fatal 26264 1727204242.44082: done checking for any_errors_fatal 26264 1727204242.44083: checking for max_fail_percentage 26264 1727204242.44085: done checking for max_fail_percentage 26264 1727204242.44086: checking to see if all 
hosts have failed and the running result is not ok 26264 1727204242.44087: done checking to see if all hosts have failed 26264 1727204242.44088: getting the remaining hosts for this loop 26264 1727204242.44089: done getting the remaining hosts for this loop 26264 1727204242.44094: getting the next task for host managed-node3 26264 1727204242.44102: done getting next task for host managed-node3 26264 1727204242.44106: ^ task is: TASK: Include the task 'show_interfaces.yml' 26264 1727204242.44109: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204242.44114: getting variables 26264 1727204242.44115: in VariableManager get_vars() 26264 1727204242.44154: Calling all_inventory to load vars for managed-node3 26264 1727204242.44158: Calling groups_inventory to load vars for managed-node3 26264 1727204242.44162: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.44177: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.44180: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.44183: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.44390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.44933: done with get_vars() 26264 1727204242.44945: done getting variables 26264 1727204242.45002: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000132 26264 1727204242.45006: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.026) 0:00:06.300 ***** 26264 1727204242.45173: entering _queue_task() for managed-node3/include_tasks 26264 1727204242.45440: worker is 1 (out of 1 available) 26264 1727204242.45456: exiting _queue_task() for managed-node3/include_tasks 26264 1727204242.45474: done queuing things up, now waiting for results queue to drain 26264 1727204242.45476: waiting for pending results... 
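
The two skipped guard tasks traced above (task paths manage_test_interface.yml:3 and :8) would take roughly the following shape. This is a hypothetical reconstruction: only the `when` expressions are copied verbatim from the log's `false_condition` fields; the `fail` messages and everything else are assumptions.

```yaml
# Hypothetical sketch of the two guard tasks; only the `when` conditions
# are taken verbatim from the log's false_condition output.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be one of: present, absent"   # assumed message text
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be one of: dummy, tap, veth"   # assumed message text
  when: type not in ["dummy", "tap", "veth"]
```

Because both conditions evaluate to False in this run (the log shows `Evaluated conditional ... : False` for each), the `fail` action never executes and each task is reported as `skipping:` with `skip_reason: Conditional result was False`.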
26264 1727204242.45735: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 26264 1727204242.45852: in run() - task 0affcd87-79f5-5ff5-08b0-000000000133 26264 1727204242.45874: variable 'ansible_search_path' from source: unknown 26264 1727204242.45881: variable 'ansible_search_path' from source: unknown 26264 1727204242.45917: calling self._execute() 26264 1727204242.45999: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.46009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.46024: variable 'omit' from source: magic vars 26264 1727204242.47123: variable 'ansible_distribution_major_version' from source: facts 26264 1727204242.47142: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204242.47153: _execute() done 26264 1727204242.47160: dumping result to json 26264 1727204242.47170: done dumping result, returning 26264 1727204242.47179: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-5ff5-08b0-000000000133] 26264 1727204242.47188: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000133 26264 1727204242.47318: no more pending results, returning what we have 26264 1727204242.47323: in VariableManager get_vars() 26264 1727204242.47361: Calling all_inventory to load vars for managed-node3 26264 1727204242.47370: Calling groups_inventory to load vars for managed-node3 26264 1727204242.47374: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.47388: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.47391: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.47393: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.47594: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000133 26264 1727204242.47597: WORKER PROCESS EXITING 26264 1727204242.47618: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.47927: done with get_vars() 26264 1727204242.47934: variable 'ansible_search_path' from source: unknown 26264 1727204242.47935: variable 'ansible_search_path' from source: unknown 26264 1727204242.47963: we have included files to process 26264 1727204242.47965: generating all_blocks data 26264 1727204242.47967: done generating all_blocks data 26264 1727204242.47969: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 26264 1727204242.47970: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 26264 1727204242.47972: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 26264 1727204242.48047: in VariableManager get_vars() 26264 1727204242.48061: done with get_vars() 26264 1727204242.48139: done processing included file 26264 1727204242.48141: iterating over new_blocks loaded from include file 26264 1727204242.48142: in VariableManager get_vars() 26264 1727204242.48151: done with get_vars() 26264 1727204242.48152: filtering new block on tags 26264 1727204242.48165: done filtering new block on tags 26264 1727204242.48167: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 26264 1727204242.48170: extending task lists for all hosts with included blocks 26264 1727204242.48399: done extending task lists 26264 1727204242.48400: done processing included files 26264 1727204242.48401: results queue empty 26264 1727204242.48401: checking for any_errors_fatal 26264 1727204242.48403: done checking for any_errors_fatal 26264 1727204242.48403: checking for 
max_fail_percentage 26264 1727204242.48404: done checking for max_fail_percentage 26264 1727204242.48405: checking to see if all hosts have failed and the running result is not ok 26264 1727204242.48405: done checking to see if all hosts have failed 26264 1727204242.48406: getting the remaining hosts for this loop 26264 1727204242.48406: done getting the remaining hosts for this loop 26264 1727204242.48408: getting the next task for host managed-node3 26264 1727204242.48411: done getting next task for host managed-node3 26264 1727204242.48412: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 26264 1727204242.48414: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204242.48416: getting variables 26264 1727204242.48416: in VariableManager get_vars() 26264 1727204242.48421: Calling all_inventory to load vars for managed-node3 26264 1727204242.48423: Calling groups_inventory to load vars for managed-node3 26264 1727204242.48424: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.48429: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.48430: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.48435: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.48798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.48994: done with get_vars() 26264 1727204242.49002: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.038) 0:00:06.339 ***** 26264 1727204242.49076: entering _queue_task() for managed-node3/include_tasks 26264 1727204242.49351: worker is 1 (out of 1 available) 26264 1727204242.49365: exiting _queue_task() for managed-node3/include_tasks 26264 1727204242.49378: done queuing things up, now waiting for results queue to drain 26264 1727204242.49380: waiting for pending results... 
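
The nested include processing traced here (manage_test_interface.yml pulling in show_interfaces.yml, which in turn pulls in get_current_interfaces.yml) corresponds to a chain of `include_tasks` calls. A minimal sketch, assuming the task names and file names shown in the log but not the files' actual contents:

```yaml
# Hypothetical sketch of the include chain seen in the trace.
# In manage_test_interface.yml (around line 13):
- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml

# In show_interfaces.yml (around line 3):
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```

Each `include_tasks` is what triggers the "we have included files to process / generating all_blocks data / extending task lists for all hosts with included blocks" sequence in the log: the included file is loaded and parsed at runtime, its blocks are filtered on tags, and the host's task list is extended before iteration continues.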
26264 1727204242.49637: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 26264 1727204242.50540: in run() - task 0affcd87-79f5-5ff5-08b0-00000000015c 26264 1727204242.50609: variable 'ansible_search_path' from source: unknown 26264 1727204242.50619: variable 'ansible_search_path' from source: unknown 26264 1727204242.50661: calling self._execute() 26264 1727204242.50794: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.50878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.50893: variable 'omit' from source: magic vars 26264 1727204242.51399: variable 'ansible_distribution_major_version' from source: facts 26264 1727204242.51420: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204242.51447: _execute() done 26264 1727204242.51455: dumping result to json 26264 1727204242.51462: done dumping result, returning 26264 1727204242.51473: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-5ff5-08b0-00000000015c] 26264 1727204242.51484: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000015c 26264 1727204242.51590: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000015c 26264 1727204242.51598: WORKER PROCESS EXITING 26264 1727204242.51987: no more pending results, returning what we have 26264 1727204242.51991: in VariableManager get_vars() 26264 1727204242.52019: Calling all_inventory to load vars for managed-node3 26264 1727204242.52022: Calling groups_inventory to load vars for managed-node3 26264 1727204242.52025: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.52035: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.52037: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.52039: Calling groups_plugins_play to load vars for managed-node3 26264 
1727204242.52199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.52407: done with get_vars() 26264 1727204242.52415: variable 'ansible_search_path' from source: unknown 26264 1727204242.52416: variable 'ansible_search_path' from source: unknown 26264 1727204242.52476: we have included files to process 26264 1727204242.52478: generating all_blocks data 26264 1727204242.52479: done generating all_blocks data 26264 1727204242.52481: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 26264 1727204242.52482: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 26264 1727204242.52484: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 26264 1727204242.52740: done processing included file 26264 1727204242.52742: iterating over new_blocks loaded from include file 26264 1727204242.52744: in VariableManager get_vars() 26264 1727204242.52761: done with get_vars() 26264 1727204242.52763: filtering new block on tags 26264 1727204242.52786: done filtering new block on tags 26264 1727204242.52792: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 26264 1727204242.52801: extending task lists for all hosts with included blocks 26264 1727204242.52951: done extending task lists 26264 1727204242.52952: done processing included files 26264 1727204242.52953: results queue empty 26264 1727204242.52954: checking for any_errors_fatal 26264 1727204242.52956: done checking for any_errors_fatal 26264 1727204242.52957: checking for max_fail_percentage 26264 1727204242.52958: done 
checking for max_fail_percentage 26264 1727204242.52959: checking to see if all hosts have failed and the running result is not ok 26264 1727204242.52960: done checking to see if all hosts have failed 26264 1727204242.52961: getting the remaining hosts for this loop 26264 1727204242.52962: done getting the remaining hosts for this loop 26264 1727204242.52966: getting the next task for host managed-node3 26264 1727204242.52970: done getting next task for host managed-node3 26264 1727204242.52972: ^ task is: TASK: Gather current interface info 26264 1727204242.52976: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204242.52978: getting variables 26264 1727204242.52979: in VariableManager get_vars() 26264 1727204242.52987: Calling all_inventory to load vars for managed-node3 26264 1727204242.52989: Calling groups_inventory to load vars for managed-node3 26264 1727204242.52991: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.52995: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.53000: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.53004: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.53162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.53347: done with get_vars() 26264 1727204242.53356: done getting variables 26264 1727204242.53392: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.043) 0:00:06.383 ***** 26264 1727204242.53418: entering _queue_task() for managed-node3/command 26264 1727204242.53645: worker is 1 (out of 1 available) 26264 1727204242.53659: exiting _queue_task() for managed-node3/command 26264 1727204242.53672: done queuing things up, now waiting for results queue to drain 26264 1727204242.53674: waiting for pending results... 
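
The "Gather current interface info" task queued here runs through the `command` action plugin over the `ssh` connection plugin, as the plugin-loading lines below show. Its definition would look something like the sketch below; the exact command and register name are not visible in this log excerpt, so both are assumptions:

```yaml
# Hypothetical shape of the task at get_current_interfaces.yml:3.
# The command and the register variable name are assumptions; the log
# only confirms that the `command` action module is used for this task.
- name: Gather current interface info
  command: ls -1 /sys/class/net        # assumed: some command listing interfaces
  register: _current_interfaces        # assumed register name
```

Running a `command` task is what triggers the connection setup that follows: loading the `ssh` connection and `sh` shell plugins, setting the connection vars (`ansible_pipelining`, `ansible_timeout`, `ansible_module_compression`, etc.), and then the `_low_level_execute_command()` calls that probe the remote home directory and create the remote temp directory.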
26264 1727204242.54779: running TaskExecutor() for managed-node3/TASK: Gather current interface info 26264 1727204242.54886: in run() - task 0affcd87-79f5-5ff5-08b0-000000000193 26264 1727204242.54901: variable 'ansible_search_path' from source: unknown 26264 1727204242.54907: variable 'ansible_search_path' from source: unknown 26264 1727204242.54951: calling self._execute() 26264 1727204242.55036: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.55045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.55059: variable 'omit' from source: magic vars 26264 1727204242.55417: variable 'ansible_distribution_major_version' from source: facts 26264 1727204242.55436: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204242.55449: variable 'omit' from source: magic vars 26264 1727204242.55506: variable 'omit' from source: magic vars 26264 1727204242.55539: variable 'omit' from source: magic vars 26264 1727204242.55588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204242.55627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204242.55655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204242.55682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204242.55695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204242.55723: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204242.55729: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.55735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 
1727204242.55836: Set connection var ansible_pipelining to False 26264 1727204242.55843: Set connection var ansible_connection to ssh 26264 1727204242.55852: Set connection var ansible_shell_type to sh 26264 1727204242.55862: Set connection var ansible_shell_executable to /bin/sh 26264 1727204242.55876: Set connection var ansible_timeout to 10 26264 1727204242.55890: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204242.55913: variable 'ansible_shell_executable' from source: unknown 26264 1727204242.55918: variable 'ansible_connection' from source: unknown 26264 1727204242.55924: variable 'ansible_module_compression' from source: unknown 26264 1727204242.55928: variable 'ansible_shell_type' from source: unknown 26264 1727204242.55933: variable 'ansible_shell_executable' from source: unknown 26264 1727204242.55937: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.55942: variable 'ansible_pipelining' from source: unknown 26264 1727204242.55950: variable 'ansible_timeout' from source: unknown 26264 1727204242.55956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.56091: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204242.56108: variable 'omit' from source: magic vars 26264 1727204242.56117: starting attempt loop 26264 1727204242.56122: running the handler 26264 1727204242.56137: _low_level_execute_command(): starting 26264 1727204242.56146: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204242.56904: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204242.56919: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 26264 1727204242.56932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.56952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.57001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.57012: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204242.57024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.57041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204242.57055: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204242.57068: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204242.57083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.57096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.57110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.57121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.57131: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204242.57143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.57224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204242.57242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204242.57258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204242.57341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 26264 1727204242.58972: stdout chunk (state=3): >>>/root <<< 26264 1727204242.59081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204242.59183: stderr chunk (state=3): >>><<< 26264 1727204242.59194: stdout chunk (state=3): >>><<< 26264 1727204242.59323: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204242.59326: _low_level_execute_command(): starting 26264 1727204242.59329: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573 `" && echo ansible-tmp-1727204242.5922403-26816-227080581825573="` echo /root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573 `" ) && sleep 0' 26264 1727204242.59955: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204242.59977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.59992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.60008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.60054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.60069: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204242.60090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.60110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204242.60124: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204242.60138: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204242.60153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.60167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.60183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.60199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.60210: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204242.60222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.60304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204242.60329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204242.60344: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 26264 1727204242.60429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204242.62267: stdout chunk (state=3): >>>ansible-tmp-1727204242.5922403-26816-227080581825573=/root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573 <<< 26264 1727204242.62474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204242.62478: stdout chunk (state=3): >>><<< 26264 1727204242.62481: stderr chunk (state=3): >>><<< 26264 1727204242.62810: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204242.5922403-26816-227080581825573=/root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204242.62814: variable 'ansible_module_compression' from source: unknown 26264 1727204242.62820: 
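The `mkdir` one-liner executed just above creates Ansible's per-task remote temp directory, whose name in this log looks like `ansible-tmp-1727204242.5922403-26816-227080581825573`. A minimal Python sketch of that naming pattern follows; the exact components Ansible composes (epoch time, a pid, a random suffix) are an assumption inferred from the log, not taken from Ansible's source:

```python
import os
import random
import re
import time

def make_remote_tmp_name() -> str:
    """Build a tmpdir name shaped like ansible-tmp-<epoch>-<pid>-<random>.

    Assumption: mirrors only the shape visible in the log above; Ansible's
    real implementation may choose the components differently.
    """
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

# Shape check: float epoch with a decimal point, then two integer fields.
NAME_RE = re.compile(r"^ansible-tmp-\d+\.\d+-\d+-\d+$")

name = make_remote_tmp_name()
print(name)
```

Note the tmpdir path is echoed back to the controller (the `ansible-tmp-…=…` line in stdout above) so later steps can reference it without re-deriving the name.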
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26264 1727204242.62823: variable 'ansible_facts' from source: unknown 26264 1727204242.62826: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573/AnsiballZ_command.py 26264 1727204242.62894: Sending initial data 26264 1727204242.62898: Sent initial data (156 bytes) 26264 1727204242.63906: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204242.63922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.63935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.63953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.63999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.64011: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204242.64024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.64040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204242.64053: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204242.64062: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204242.64078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.64088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.64101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.64110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 <<< 26264 1727204242.64119: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204242.64135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.64220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204242.64244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204242.64263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204242.64340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204242.66037: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204242.66084: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204242.66127: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp_aq351_k /root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573/AnsiballZ_command.py <<< 26264 1727204242.66162: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204242.67284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204242.67502: stderr chunk (state=3): >>><<< 26264 1727204242.67505: stdout chunk (state=3): >>><<< 26264 
1727204242.67507: done transferring module to remote 26264 1727204242.67515: _low_level_execute_command(): starting 26264 1727204242.67517: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573/ /root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573/AnsiballZ_command.py && sleep 0' 26264 1727204242.68144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204242.68158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.68178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.68197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.68239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.68250: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204242.68263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.68285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204242.68299: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204242.68309: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204242.68320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.68331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.68345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.68356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.68367: stderr chunk 
(state=3): >>>debug2: match found <<< 26264 1727204242.68381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.68495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204242.68523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204242.68538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204242.68656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204242.70381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204242.70467: stderr chunk (state=3): >>><<< 26264 1727204242.70470: stdout chunk (state=3): >>><<< 26264 1727204242.70569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
26264 1727204242.70573: _low_level_execute_command(): starting 26264 1727204242.70576: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573/AnsiballZ_command.py && sleep 0' 26264 1727204242.71984: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204242.72000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.72023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.72043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.72088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.72103: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204242.72116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.72134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204242.72150: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204242.72162: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204242.72178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.72194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.72214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.72228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.72242: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204242.72260: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.72351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204242.72370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204242.72384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204242.72553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204242.85960: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:22.855146", "end": "2024-09-24 14:57:22.858407", "delta": "0:00:00.003261", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26264 1727204242.87075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204242.87151: stderr chunk (state=3): >>><<< 26264 1727204242.87155: stdout chunk (state=3): >>><<< 26264 1727204242.87303: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:22.855146", "end": "2024-09-24 14:57:22.858407", "delta": "0:00:00.003261", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
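The `_low_level_execute_command()` result above carries the command module's entire reply as a single JSON object on stdout. A minimal Python sketch (the helper name is hypothetical, not Ansible's API) showing how such a payload decodes into the interface list the play is after:

```python
import json

# Verbatim payload from the log above, abridged to the relevant keys.
raw = ('{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", '
       '"rc": 0, "cmd": ["ls", "-1"]}')

def interfaces_from_result(payload: str) -> list[str]:
    """Decode a command-module result and split its stdout into lines."""
    result = json.loads(payload)
    if result.get("rc") != 0:  # a non-zero rc means the remote ls failed
        raise RuntimeError("remote command failed: rc=%s" % result.get("rc"))
    return result["stdout"].splitlines()

print(interfaces_from_result(raw))  # → ['bonding_masters', 'eth0', 'lo']
```

This matches the `STDOUT: bonding_masters eth0 lo` rendering in the task result further down.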
26264 1727204242.87311: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204242.87313: _low_level_execute_command(): starting 26264 1727204242.87316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204242.5922403-26816-227080581825573/ > /dev/null 2>&1 && sleep 0' 26264 1727204242.88907: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204242.88922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.89182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.89204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.89250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.89262: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204242.89278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.89295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204242.89306: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204242.89316: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204242.89326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204242.89338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204242.89352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204242.89367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204242.89380: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204242.89396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204242.89476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204242.89498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204242.89513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204242.89587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204242.91383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204242.91458: stderr chunk (state=3): >>><<< 26264 1727204242.91461: stdout chunk (state=3): >>><<< 26264 1727204242.91572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204242.91575: handler run complete 26264 1727204242.91577: Evaluated conditional (False): False 26264 1727204242.91579: attempt loop complete, returning result 26264 1727204242.91581: _execute() done 26264 1727204242.91583: dumping result to json 26264 1727204242.91585: done dumping result, returning 26264 1727204242.91587: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [0affcd87-79f5-5ff5-08b0-000000000193] 26264 1727204242.91589: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000193 26264 1727204242.91749: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000193 26264 1727204242.91752: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003261", "end": "2024-09-24 14:57:22.858407", "rc": 0, "start": "2024-09-24 14:57:22.855146" } STDOUT: bonding_masters eth0 lo 26264 1727204242.91829: no more pending results, returning what we have 26264 1727204242.91832: results queue empty 26264 1727204242.91833: checking for any_errors_fatal 26264 1727204242.91834: done checking for any_errors_fatal 26264 1727204242.91835: checking for max_fail_percentage 26264 1727204242.91836: done checking for max_fail_percentage 26264 1727204242.91837: checking to see if all hosts have failed 
and the running result is not ok 26264 1727204242.91838: done checking to see if all hosts have failed 26264 1727204242.91839: getting the remaining hosts for this loop 26264 1727204242.91840: done getting the remaining hosts for this loop 26264 1727204242.91844: getting the next task for host managed-node3 26264 1727204242.91853: done getting next task for host managed-node3 26264 1727204242.91856: ^ task is: TASK: Set current_interfaces 26264 1727204242.91862: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204242.91867: getting variables 26264 1727204242.91869: in VariableManager get_vars() 26264 1727204242.91898: Calling all_inventory to load vars for managed-node3 26264 1727204242.91901: Calling groups_inventory to load vars for managed-node3 26264 1727204242.91904: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.91916: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.91918: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.91925: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.92108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.92322: done with get_vars() 26264 1727204242.92334: done getting variables 26264 1727204242.92401: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.390) 0:00:06.773 ***** 26264 1727204242.92436: entering _queue_task() for managed-node3/set_fact 26264 1727204242.92913: worker is 1 (out of 1 available) 26264 1727204242.92926: exiting _queue_task() for managed-node3/set_fact 26264 1727204242.92941: done queuing things up, now waiting for results queue to drain 26264 1727204242.92942: waiting for pending results... 
26264 1727204242.93572: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 26264 1727204242.93666: in run() - task 0affcd87-79f5-5ff5-08b0-000000000194 26264 1727204242.93931: variable 'ansible_search_path' from source: unknown 26264 1727204242.93942: variable 'ansible_search_path' from source: unknown 26264 1727204242.94023: calling self._execute() 26264 1727204242.94122: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.94136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.94151: variable 'omit' from source: magic vars 26264 1727204242.94711: variable 'ansible_distribution_major_version' from source: facts 26264 1727204242.94738: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204242.94749: variable 'omit' from source: magic vars 26264 1727204242.94823: variable 'omit' from source: magic vars 26264 1727204242.94948: variable '_current_interfaces' from source: set_fact 26264 1727204242.95020: variable 'omit' from source: magic vars 26264 1727204242.95079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204242.95132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204242.95158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204242.95186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204242.95202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204242.95242: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204242.95250: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.95258: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.95370: Set connection var ansible_pipelining to False 26264 1727204242.95378: Set connection var ansible_connection to ssh 26264 1727204242.95389: Set connection var ansible_shell_type to sh 26264 1727204242.95406: Set connection var ansible_shell_executable to /bin/sh 26264 1727204242.95419: Set connection var ansible_timeout to 10 26264 1727204242.95435: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204242.95468: variable 'ansible_shell_executable' from source: unknown 26264 1727204242.95477: variable 'ansible_connection' from source: unknown 26264 1727204242.95483: variable 'ansible_module_compression' from source: unknown 26264 1727204242.95490: variable 'ansible_shell_type' from source: unknown 26264 1727204242.95501: variable 'ansible_shell_executable' from source: unknown 26264 1727204242.95508: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.95515: variable 'ansible_pipelining' from source: unknown 26264 1727204242.95520: variable 'ansible_timeout' from source: unknown 26264 1727204242.95527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.96021: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204242.96039: variable 'omit' from source: magic vars 26264 1727204242.96049: starting attempt loop 26264 1727204242.96055: running the handler 26264 1727204242.96072: handler run complete 26264 1727204242.96086: attempt loop complete, returning result 26264 1727204242.96097: _execute() done 26264 1727204242.96102: dumping result to json 26264 1727204242.96109: done dumping result, returning 26264 
1727204242.96118: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [0affcd87-79f5-5ff5-08b0-000000000194] 26264 1727204242.96127: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000194 ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 26264 1727204242.96280: no more pending results, returning what we have 26264 1727204242.96283: results queue empty 26264 1727204242.96284: checking for any_errors_fatal 26264 1727204242.96291: done checking for any_errors_fatal 26264 1727204242.96291: checking for max_fail_percentage 26264 1727204242.96293: done checking for max_fail_percentage 26264 1727204242.96295: checking to see if all hosts have failed and the running result is not ok 26264 1727204242.96296: done checking to see if all hosts have failed 26264 1727204242.96297: getting the remaining hosts for this loop 26264 1727204242.96298: done getting the remaining hosts for this loop 26264 1727204242.96303: getting the next task for host managed-node3 26264 1727204242.96312: done getting next task for host managed-node3 26264 1727204242.96315: ^ task is: TASK: Show current_interfaces 26264 1727204242.96319: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204242.96323: getting variables 26264 1727204242.96327: in VariableManager get_vars() 26264 1727204242.96356: Calling all_inventory to load vars for managed-node3 26264 1727204242.96359: Calling groups_inventory to load vars for managed-node3 26264 1727204242.96363: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204242.96377: Calling all_plugins_play to load vars for managed-node3 26264 1727204242.96380: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204242.96383: Calling groups_plugins_play to load vars for managed-node3 26264 1727204242.96623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204242.96825: done with get_vars() 26264 1727204242.96837: done getting variables 26264 1727204242.96908: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000194 26264 1727204242.96912: WORKER PROCESS EXITING 26264 1727204242.96946: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.047) 0:00:06.820 ***** 26264 1727204242.97141: entering _queue_task() for managed-node3/debug 26264 1727204242.97614: worker is 1 (out of 1 available) 26264 1727204242.97628: exiting _queue_task() for managed-node3/debug 26264 1727204242.97641: done queuing things up, now waiting for results queue to drain 26264 1727204242.97643: waiting for pending 
results... 26264 1727204242.98161: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 26264 1727204242.98396: in run() - task 0affcd87-79f5-5ff5-08b0-00000000015d 26264 1727204242.98415: variable 'ansible_search_path' from source: unknown 26264 1727204242.98477: variable 'ansible_search_path' from source: unknown 26264 1727204242.98514: calling self._execute() 26264 1727204242.98650: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.98661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.98677: variable 'omit' from source: magic vars 26264 1727204242.99050: variable 'ansible_distribution_major_version' from source: facts 26264 1727204242.99125: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204242.99137: variable 'omit' from source: magic vars 26264 1727204242.99188: variable 'omit' from source: magic vars 26264 1727204242.99344: variable 'current_interfaces' from source: set_fact 26264 1727204242.99377: variable 'omit' from source: magic vars 26264 1727204242.99417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204242.99459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204242.99488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204242.99510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204242.99525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204242.99560: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204242.99571: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.99578: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204242.99681: Set connection var ansible_pipelining to False 26264 1727204242.99689: Set connection var ansible_connection to ssh 26264 1727204242.99695: Set connection var ansible_shell_type to sh 26264 1727204242.99704: Set connection var ansible_shell_executable to /bin/sh 26264 1727204242.99718: Set connection var ansible_timeout to 10 26264 1727204242.99728: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204242.99753: variable 'ansible_shell_executable' from source: unknown 26264 1727204242.99760: variable 'ansible_connection' from source: unknown 26264 1727204242.99772: variable 'ansible_module_compression' from source: unknown 26264 1727204242.99779: variable 'ansible_shell_type' from source: unknown 26264 1727204242.99785: variable 'ansible_shell_executable' from source: unknown 26264 1727204242.99791: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204242.99798: variable 'ansible_pipelining' from source: unknown 26264 1727204242.99804: variable 'ansible_timeout' from source: unknown 26264 1727204242.99810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204243.00214: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204243.00231: variable 'omit' from source: magic vars 26264 1727204243.00242: starting attempt loop 26264 1727204243.00248: running the handler 26264 1727204243.00301: handler run complete 26264 1727204243.00319: attempt loop complete, returning result 26264 1727204243.00327: _execute() done 26264 1727204243.00333: dumping result to json 26264 1727204243.00340: done dumping result, returning 26264 1727204243.00351: done 
running TaskExecutor() for managed-node3/TASK: Show current_interfaces [0affcd87-79f5-5ff5-08b0-00000000015d] 26264 1727204243.00359: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000015d ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 26264 1727204243.00516: no more pending results, returning what we have 26264 1727204243.00520: results queue empty 26264 1727204243.00521: checking for any_errors_fatal 26264 1727204243.00528: done checking for any_errors_fatal 26264 1727204243.00529: checking for max_fail_percentage 26264 1727204243.00531: done checking for max_fail_percentage 26264 1727204243.00532: checking to see if all hosts have failed and the running result is not ok 26264 1727204243.00533: done checking to see if all hosts have failed 26264 1727204243.00534: getting the remaining hosts for this loop 26264 1727204243.00536: done getting the remaining hosts for this loop 26264 1727204243.00540: getting the next task for host managed-node3 26264 1727204243.00550: done getting next task for host managed-node3 26264 1727204243.00554: ^ task is: TASK: Install iproute 26264 1727204243.00558: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204243.00565: getting variables 26264 1727204243.00567: in VariableManager get_vars() 26264 1727204243.00598: Calling all_inventory to load vars for managed-node3 26264 1727204243.00602: Calling groups_inventory to load vars for managed-node3 26264 1727204243.00606: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204243.00617: Calling all_plugins_play to load vars for managed-node3 26264 1727204243.00619: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204243.00622: Calling groups_plugins_play to load vars for managed-node3 26264 1727204243.00806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204243.00975: done with get_vars() 26264 1727204243.00986: done getting variables 26264 1727204243.01036: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.039) 0:00:06.859 ***** 26264 1727204243.01075: entering _queue_task() for managed-node3/package 26264 1727204243.01095: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000015d 26264 1727204243.01175: WORKER PROCESS EXITING 26264 1727204243.01669: worker is 1 (out of 1 available) 26264 1727204243.01683: exiting _queue_task() for managed-node3/package 26264 1727204243.01698: done queuing things up, now waiting for results queue to drain 26264 1727204243.01699: waiting for pending results... 
26264 1727204243.02388: running TaskExecutor() for managed-node3/TASK: Install iproute 26264 1727204243.02528: in run() - task 0affcd87-79f5-5ff5-08b0-000000000134 26264 1727204243.02750: variable 'ansible_search_path' from source: unknown 26264 1727204243.02763: variable 'ansible_search_path' from source: unknown 26264 1727204243.02907: calling self._execute() 26264 1727204243.03007: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204243.03090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204243.03104: variable 'omit' from source: magic vars 26264 1727204243.04504: variable 'ansible_distribution_major_version' from source: facts 26264 1727204243.04526: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204243.04538: variable 'omit' from source: magic vars 26264 1727204243.04586: variable 'omit' from source: magic vars 26264 1727204243.04790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204243.07329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204243.07412: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204243.07459: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204243.07505: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204243.07562: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204243.07903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204243.07938: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204243.07976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204243.08021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204243.08041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204243.08152: variable '__network_is_ostree' from source: set_fact 26264 1727204243.08171: variable 'omit' from source: magic vars 26264 1727204243.08208: variable 'omit' from source: magic vars 26264 1727204243.08239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204243.08278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204243.08301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204243.08324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204243.08340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204243.08383: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204243.08392: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204243.08399: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 26264 1727204243.08506: Set connection var ansible_pipelining to False 26264 1727204243.08514: Set connection var ansible_connection to ssh 26264 1727204243.08536: Set connection var ansible_shell_type to sh 26264 1727204243.08552: Set connection var ansible_shell_executable to /bin/sh 26264 1727204243.08569: Set connection var ansible_timeout to 10 26264 1727204243.08585: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204243.08622: variable 'ansible_shell_executable' from source: unknown 26264 1727204243.08631: variable 'ansible_connection' from source: unknown 26264 1727204243.08638: variable 'ansible_module_compression' from source: unknown 26264 1727204243.08650: variable 'ansible_shell_type' from source: unknown 26264 1727204243.08659: variable 'ansible_shell_executable' from source: unknown 26264 1727204243.08668: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204243.08677: variable 'ansible_pipelining' from source: unknown 26264 1727204243.08684: variable 'ansible_timeout' from source: unknown 26264 1727204243.08692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204243.08806: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204243.08827: variable 'omit' from source: magic vars 26264 1727204243.08838: starting attempt loop 26264 1727204243.08844: running the handler 26264 1727204243.08858: variable 'ansible_facts' from source: unknown 26264 1727204243.08869: variable 'ansible_facts' from source: unknown 26264 1727204243.08906: _low_level_execute_command(): starting 26264 1727204243.08917: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 
1727204243.09638: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204243.09642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204243.09671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204243.09674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204243.09677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204243.09731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204243.09734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204243.09788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204243.11404: stdout chunk (state=3): >>>/root <<< 26264 1727204243.11604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204243.11607: stdout chunk (state=3): >>><<< 26264 1727204243.11609: stderr chunk (state=3): >>><<< 26264 1727204243.11718: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204243.11730: _low_level_execute_command(): starting 26264 1727204243.11733: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857 `" && echo ansible-tmp-1727204243.1162927-26878-203399405482857="` echo /root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857 `" ) && sleep 0' 26264 1727204243.12280: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204243.12296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204243.12312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204243.12331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204243.12378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 <<< 26264 1727204243.12392: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204243.12406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204243.12423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204243.12436: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204243.12446: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204243.12462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204243.12481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204243.12498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204243.12510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204243.12520: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204243.12532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204243.12611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204243.12627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204243.12640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204243.12707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204243.14534: stdout chunk (state=3): >>>ansible-tmp-1727204243.1162927-26878-203399405482857=/root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857 <<< 26264 1727204243.14646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204243.14727: stderr chunk (state=3): >>><<< 26264 
1727204243.14729: stdout chunk (state=3): >>><<< 26264 1727204243.14775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204243.1162927-26878-203399405482857=/root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204243.14814: variable 'ansible_module_compression' from source: unknown 26264 1727204243.14889: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 26264 1727204243.14893: ANSIBALLZ: Acquiring lock 26264 1727204243.14895: ANSIBALLZ: Lock acquired: 139841028923536 26264 1727204243.14897: ANSIBALLZ: Creating module 26264 1727204243.27073: ANSIBALLZ: Writing module into payload 26264 1727204243.27265: ANSIBALLZ: Writing module 26264 1727204243.27285: ANSIBALLZ: Renaming module 26264 1727204243.27295: ANSIBALLZ: Done creating module 26264 1727204243.27315: variable 'ansible_facts' from source: 
unknown 26264 1727204243.27377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857/AnsiballZ_dnf.py 26264 1727204243.27485: Sending initial data 26264 1727204243.27488: Sent initial data (152 bytes) 26264 1727204243.28174: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204243.28178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204243.28212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204243.28321: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204243.28324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204243.28327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204243.28397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204243.30144: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204243.30187: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204243.30223: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpk_27k2ur /root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857/AnsiballZ_dnf.py <<< 26264 1727204243.30274: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204243.31833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204243.31963: stderr chunk (state=3): >>><<< 26264 1727204243.31969: stdout chunk (state=3): >>><<< 26264 1727204243.31971: done transferring module to remote 26264 1727204243.31977: _low_level_execute_command(): starting 26264 1727204243.31980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857/ /root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857/AnsiballZ_dnf.py && sleep 0' 26264 1727204243.32606: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204243.32628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204243.32644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204243.32667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204243.32711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 <<< 26264 1727204243.32728: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204243.32748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204243.32769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204243.32782: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204243.32794: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204243.32807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204243.32821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204243.32844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204243.32859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204243.32874: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204243.32888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204243.32972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204243.32990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204243.33004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204243.33185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204243.34799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204243.34887: stderr chunk (state=3): >>><<< 26264 1727204243.34897: stdout chunk (state=3): >>><<< 26264 1727204243.35002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204243.35006: _low_level_execute_command(): starting 26264 1727204243.35008: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857/AnsiballZ_dnf.py && sleep 0' 26264 1727204243.35620: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204243.35636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204243.35655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204243.35686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204243.35727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204243.35738: stderr chunk (state=3): >>>debug2: match not found <<< 26264 
1727204243.35750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204243.35769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204243.35782: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204243.35791: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204243.35801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204243.35811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204243.35823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204243.35833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204243.35841: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204243.35853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204243.35928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204243.35951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204243.35972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204243.36081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.27036: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", 
"install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 26264 1727204244.31160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204244.31166: stdout chunk (state=3): >>><<< 26264 1727204244.31169: stderr chunk (state=3): >>><<< 26264 1727204244.31320: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204244.31329: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204244.31332: _low_level_execute_command(): starting 26264 1727204244.31334: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204243.1162927-26878-203399405482857/ > /dev/null 2>&1 && sleep 0' 26264 1727204244.31890: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.31899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.31937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.31957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.31973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.31991: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204244.32075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.32106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.32109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204244.32125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.32210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.33962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204244.34006: stderr chunk (state=3): >>><<< 26264 1727204244.34009: stdout chunk (state=3): >>><<< 26264 1727204244.34021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204244.34028: handler run complete 26264 1727204244.34142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204244.34266: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204244.34297: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204244.34320: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204244.34342: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204244.34397: variable '__install_status' from source: unknown 26264 1727204244.34411: Evaluated conditional (__install_status is success): True 26264 1727204244.34426: attempt loop complete, returning result 26264 1727204244.34429: _execute() done 26264 1727204244.34466: dumping result to json 26264 1727204244.34471: done dumping result, returning 26264 1727204244.34473: done running TaskExecutor() for managed-node3/TASK: Install iproute 
[0affcd87-79f5-5ff5-08b0-000000000134] 26264 1727204244.34477: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000134 ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 26264 1727204244.34640: no more pending results, returning what we have 26264 1727204244.34643: results queue empty 26264 1727204244.34644: checking for any_errors_fatal 26264 1727204244.34652: done checking for any_errors_fatal 26264 1727204244.34652: checking for max_fail_percentage 26264 1727204244.34654: done checking for max_fail_percentage 26264 1727204244.34655: checking to see if all hosts have failed and the running result is not ok 26264 1727204244.34656: done checking to see if all hosts have failed 26264 1727204244.34656: getting the remaining hosts for this loop 26264 1727204244.34658: done getting the remaining hosts for this loop 26264 1727204244.34662: getting the next task for host managed-node3 26264 1727204244.34674: done getting next task for host managed-node3 26264 1727204244.34676: ^ task is: TASK: Create veth interface {{ interface }} 26264 1727204244.34679: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204244.34683: getting variables 26264 1727204244.34684: in VariableManager get_vars() 26264 1727204244.34773: Calling all_inventory to load vars for managed-node3 26264 1727204244.34781: Calling groups_inventory to load vars for managed-node3 26264 1727204244.34785: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204244.34793: Calling all_plugins_play to load vars for managed-node3 26264 1727204244.34796: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204244.34798: Calling groups_plugins_play to load vars for managed-node3 26264 1727204244.34952: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000134 26264 1727204244.34955: WORKER PROCESS EXITING 26264 1727204244.35000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204244.35207: done with get_vars() 26264 1727204244.35222: done getting variables 26264 1727204244.35285: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204244.35411: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:57:24 -0400 (0:00:01.343) 0:00:08.203 ***** 26264 1727204244.35444: entering _queue_task() for managed-node3/command 26264 1727204244.35861: worker is 1 (out of 1 available) 26264 1727204244.35875: exiting _queue_task() for managed-node3/command 26264 1727204244.35890: done queuing things up, now waiting for results queue to drain 26264 1727204244.35891: waiting for pending results... 
26264 1727204244.36028: running TaskExecutor() for managed-node3/TASK: Create veth interface lsr27 26264 1727204244.36099: in run() - task 0affcd87-79f5-5ff5-08b0-000000000135 26264 1727204244.36112: variable 'ansible_search_path' from source: unknown 26264 1727204244.36115: variable 'ansible_search_path' from source: unknown 26264 1727204244.36304: variable 'interface' from source: set_fact 26264 1727204244.36363: variable 'interface' from source: set_fact 26264 1727204244.36414: variable 'interface' from source: set_fact 26264 1727204244.36516: Loaded config def from plugin (lookup/items) 26264 1727204244.36520: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 26264 1727204244.36542: variable 'omit' from source: magic vars 26264 1727204244.36619: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204244.36626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204244.36635: variable 'omit' from source: magic vars 26264 1727204244.36787: variable 'ansible_distribution_major_version' from source: facts 26264 1727204244.36793: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204244.36920: variable 'type' from source: set_fact 26264 1727204244.36924: variable 'state' from source: include params 26264 1727204244.36927: variable 'interface' from source: set_fact 26264 1727204244.36929: variable 'current_interfaces' from source: set_fact 26264 1727204244.36936: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 26264 1727204244.36941: variable 'omit' from source: magic vars 26264 1727204244.36969: variable 'omit' from source: magic vars 26264 1727204244.37001: variable 'item' from source: unknown 26264 1727204244.37051: variable 'item' from source: unknown 26264 1727204244.37061: variable 'omit' from source: magic vars 26264 1727204244.37090: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204244.37110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204244.37124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204244.37137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204244.37146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204244.37170: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204244.37173: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204244.37175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204244.37241: Set connection var ansible_pipelining to False 26264 1727204244.37245: Set connection var ansible_connection to ssh 26264 1727204244.37249: Set connection var ansible_shell_type to sh 26264 1727204244.37252: Set connection var ansible_shell_executable to /bin/sh 26264 1727204244.37259: Set connection var ansible_timeout to 10 26264 1727204244.37266: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204244.37282: variable 'ansible_shell_executable' from source: unknown 26264 1727204244.37284: variable 'ansible_connection' from source: unknown 26264 1727204244.37287: variable 'ansible_module_compression' from source: unknown 26264 1727204244.37289: variable 'ansible_shell_type' from source: unknown 26264 1727204244.37291: variable 'ansible_shell_executable' from source: unknown 26264 1727204244.37293: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204244.37297: variable 'ansible_pipelining' from source: unknown 26264 1727204244.37301: variable 'ansible_timeout' from 
source: unknown 26264 1727204244.37303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204244.37397: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204244.37404: variable 'omit' from source: magic vars 26264 1727204244.37409: starting attempt loop 26264 1727204244.37413: running the handler 26264 1727204244.37426: _low_level_execute_command(): starting 26264 1727204244.37433: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204244.37919: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.37934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.37946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.37960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.37976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.38018: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.38032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.38081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.39575: stdout chunk (state=3): >>>/root <<< 26264 1727204244.39678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204244.39724: stderr chunk (state=3): >>><<< 26264 1727204244.39727: stdout chunk (state=3): >>><<< 26264 1727204244.39744: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204244.39757: _low_level_execute_command(): starting 26264 1727204244.39769: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555 `" && echo ansible-tmp-1727204244.3974426-26996-120062350592555="` echo /root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555 `" ) && sleep 0' 26264 1727204244.40181: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.40193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.40209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204244.40220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204244.40238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.40275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.40287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.40337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.42124: stdout chunk (state=3): >>>ansible-tmp-1727204244.3974426-26996-120062350592555=/root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555 <<< 26264 1727204244.42238: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 26264 1727204244.42309: stderr chunk (state=3): >>><<< 26264 1727204244.42312: stdout chunk (state=3): >>><<< 26264 1727204244.42473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204244.3974426-26996-120062350592555=/root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204244.42476: variable 'ansible_module_compression' from source: unknown 26264 1727204244.42478: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26264 1727204244.42480: variable 'ansible_facts' from source: unknown 26264 1727204244.42574: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555/AnsiballZ_command.py 26264 1727204244.42725: Sending initial data 
26264 1727204244.42728: Sent initial data (156 bytes) 26264 1727204244.43704: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204244.43719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.43735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.43754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.43804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.43817: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204244.43831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.43850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204244.43866: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204244.43880: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204244.43897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.43912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.43928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.43941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.43954: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204244.43970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.44053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.44078: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK <<< 26264 1727204244.44096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.44171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.45829: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204244.45873: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204244.45910: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpaogckrsz /root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555/AnsiballZ_command.py <<< 26264 1727204244.45943: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204244.46982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204244.47250: stderr chunk (state=3): >>><<< 26264 1727204244.47253: stdout chunk (state=3): >>><<< 26264 1727204244.47255: done transferring module to remote 26264 1727204244.47258: _low_level_execute_command(): starting 26264 1727204244.47260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555/ /root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555/AnsiballZ_command.py && sleep 0' 26264 
1727204244.47859: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204244.47878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.47897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.47920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.47965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.47978: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204244.47993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.48019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204244.48034: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204244.48048: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204244.48061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.48078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.48095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.48107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.48124: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204244.48142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.48220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.48249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204244.48269: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.48344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.50198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204244.50278: stderr chunk (state=3): >>><<< 26264 1727204244.50293: stdout chunk (state=3): >>><<< 26264 1727204244.50392: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204244.50396: _low_level_execute_command(): starting 26264 1727204244.50398: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555/AnsiballZ_command.py && sleep 0' 26264 1727204244.51003: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204244.51016: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.51029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.51045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.51093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.51105: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204244.51118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.51135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204244.51148: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204244.51162: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204244.51209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.51247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.51265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.51284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.51302: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204244.51317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.51398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.51420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204244.51437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.51517: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.65530: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-24 14:57:24.644029", "end": "2024-09-24 14:57:24.653947", "delta": "0:00:00.009918", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26264 1727204244.67674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204244.67681: stdout chunk (state=3): >>><<< 26264 1727204244.67685: stderr chunk (state=3): >>><<< 26264 1727204244.67772: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-24 14:57:24.644029", "end": "2024-09-24 14:57:24.653947", "delta": "0:00:00.009918", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204244.67896: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204244.67899: _low_level_execute_command(): starting 26264 1727204244.67902: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204244.3974426-26996-120062350592555/ > /dev/null 2>&1 && sleep 0' 26264 1727204244.69419: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.69423: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.69445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.69577: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204244.69580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.69583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.69585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.69644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.69783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204244.69787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.69851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.72304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204244.72387: stderr chunk (state=3): >>><<< 26264 1727204244.72570: stdout chunk (state=3): >>><<< 26264 1727204244.72574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204244.72576: handler run complete 26264 1727204244.72578: Evaluated conditional (False): False 26264 1727204244.72580: attempt loop complete, returning result 26264 1727204244.72582: variable 'item' from source: unknown 26264 1727204244.72584: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.009918", "end": "2024-09-24 14:57:24.653947", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-24 14:57:24.644029" } 26264 1727204244.72915: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204244.72919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204244.72922: variable 'omit' from source: magic vars 26264 1727204244.73005: variable 'ansible_distribution_major_version' from source: facts 26264 1727204244.73015: 
Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204244.73203: variable 'type' from source: set_fact 26264 1727204244.73277: variable 'state' from source: include params 26264 1727204244.73286: variable 'interface' from source: set_fact 26264 1727204244.73293: variable 'current_interfaces' from source: set_fact 26264 1727204244.73302: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 26264 1727204244.73311: variable 'omit' from source: magic vars 26264 1727204244.73586: variable 'omit' from source: magic vars 26264 1727204244.73630: variable 'item' from source: unknown 26264 1727204244.73697: variable 'item' from source: unknown 26264 1727204244.73715: variable 'omit' from source: magic vars 26264 1727204244.73740: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204244.73752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204244.73762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204244.73783: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204244.73789: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204244.73796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204244.73874: Set connection var ansible_pipelining to False 26264 1727204244.73975: Set connection var ansible_connection to ssh 26264 1727204244.73982: Set connection var ansible_shell_type to sh 26264 1727204244.73991: Set connection var ansible_shell_executable to /bin/sh 26264 1727204244.74003: Set connection var ansible_timeout to 10 26264 1727204244.74013: Set connection var ansible_module_compression to 
ZIP_DEFLATED 26264 1727204244.74039: variable 'ansible_shell_executable' from source: unknown 26264 1727204244.74046: variable 'ansible_connection' from source: unknown 26264 1727204244.74052: variable 'ansible_module_compression' from source: unknown 26264 1727204244.74057: variable 'ansible_shell_type' from source: unknown 26264 1727204244.74063: variable 'ansible_shell_executable' from source: unknown 26264 1727204244.74071: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204244.74078: variable 'ansible_pipelining' from source: unknown 26264 1727204244.74083: variable 'ansible_timeout' from source: unknown 26264 1727204244.74177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204244.74272: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204244.74381: variable 'omit' from source: magic vars 26264 1727204244.74390: starting attempt loop 26264 1727204244.74396: running the handler 26264 1727204244.74406: _low_level_execute_command(): starting 26264 1727204244.74413: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204244.76054: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.76057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.76091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204244.76094: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.76097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.76108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.76116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.76123: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204244.76132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.76214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.76234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204244.76246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.76325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.77820: stdout chunk (state=3): >>>/root <<< 26264 1727204244.77998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204244.78001: stdout chunk (state=3): >>><<< 26264 1727204244.78009: stderr chunk (state=3): >>><<< 26264 1727204244.78033: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204244.78044: _low_level_execute_command(): starting 26264 1727204244.78049: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861 `" && echo ansible-tmp-1727204244.7803347-26996-129805179024861="` echo /root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861 `" ) && sleep 0' 26264 1727204244.79626: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204244.79641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.79659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.79679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.79720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.79735: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204244.79751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.79776: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 26264 1727204244.79788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204244.79800: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204244.79812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.79824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.79838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.79849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.79860: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204244.79876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.79951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.79976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204244.79995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.80067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.81869: stdout chunk (state=3): >>>ansible-tmp-1727204244.7803347-26996-129805179024861=/root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861 <<< 26264 1727204244.82068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204244.82071: stdout chunk (state=3): >>><<< 26264 1727204244.82074: stderr chunk (state=3): >>><<< 26264 1727204244.82173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204244.7803347-26996-129805179024861=/root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204244.82180: variable 'ansible_module_compression' from source: unknown 26264 1727204244.82183: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26264 1727204244.82345: variable 'ansible_facts' from source: unknown 26264 1727204244.82348: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861/AnsiballZ_command.py 26264 1727204244.82815: Sending initial data 26264 1727204244.82818: Sent initial data (156 bytes) 26264 1727204244.83768: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.83772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.83803: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204244.83806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.83808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.83889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.83895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204244.83897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.83939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.85606: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204244.85650: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server 
handle limit 1019; using 64 <<< 26264 1727204244.85687: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpbbxviqya /root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861/AnsiballZ_command.py <<< 26264 1727204244.85720: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204244.87021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204244.87147: stderr chunk (state=3): >>><<< 26264 1727204244.87151: stdout chunk (state=3): >>><<< 26264 1727204244.87153: done transferring module to remote 26264 1727204244.87155: _low_level_execute_command(): starting 26264 1727204244.87157: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861/ /root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861/AnsiballZ_command.py && sleep 0' 26264 1727204244.87744: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204244.87758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.87775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.87794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.87838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.87850: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204244.87866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.87884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204244.87895: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204244.87911: stderr 
chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204244.87923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.87936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.87951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.87965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.87976: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204244.87989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.88074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.88097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204244.88113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.88189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204244.89909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204244.89912: stdout chunk (state=3): >>><<< 26264 1727204244.89919: stderr chunk (state=3): >>><<< 26264 1727204244.90020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204244.90023: _low_level_execute_command(): starting 26264 1727204244.90026: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861/AnsiballZ_command.py && sleep 0' 26264 1727204244.91374: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204244.91388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204244.91402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.91425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.91473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.91487: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204244.91503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.91519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204244.91531: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204244.91543: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204244.91557: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 26264 1727204244.91572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204244.91586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204244.91597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204244.91607: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204244.91619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204244.91698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204244.91718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204244.91732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204244.91808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.05068: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-24 14:57:25.045978", "end": "2024-09-24 14:57:25.049488", "delta": "0:00:00.003510", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26264 1727204245.06286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204245.06291: stdout chunk (state=3): >>><<< 26264 1727204245.06293: stderr chunk (state=3): >>><<< 26264 1727204245.06433: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-24 14:57:25.045978", "end": "2024-09-24 14:57:25.049488", "delta": "0:00:00.003510", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
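The stdout chunk in the record above is the raw JSON result that the `command` module writes over the SSH channel and that the action plugin then deserializes. As a minimal sketch (plain Python, not Ansible's actual parsing code), here is how the interesting fields can be pulled out of such a payload; the JSON is copied, abridged, from the log record above:

```python
import json

# Module result as emitted on stdout by ansible.legacy.command
# (abridged from the log record above; invocation keys trimmed).
raw = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
"cmd": ["ip", "link", "set", "peerlsr27", "up"],
"start": "2024-09-24 14:57:25.045978", "end": "2024-09-24 14:57:25.049488",
"delta": "0:00:00.003510", "msg": "", "invocation": {"module_args":
{"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false}}}'''

result = json.loads(raw)

# rc == 0 with empty stderr means the `ip link` call succeeded on the target.
assert result["rc"] == 0 and result["stderr"] == ""

cmd = " ".join(result["cmd"])   # the argv the module ran, rejoined
duration = result["delta"]      # wall-clock time of the remote command
print(cmd, duration)
```

Note that the log's final task summary reports `"changed": false` for this item even though the raw payload says `"changed": true` — the `command` module always reports changed, and the playbook evidently overrides it afterwards (the `Evaluated conditional (False): False` line is that `changed_when: false` style handling).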
26264 1727204245.06437: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204245.06439: _low_level_execute_command(): starting 26264 1727204245.06442: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204244.7803347-26996-129805179024861/ > /dev/null 2>&1 && sleep 0' 26264 1727204245.07027: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.07042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.07061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.07084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.07137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.07153: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.07172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.07191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204245.07207: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.07219: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.07233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.07250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.07270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.07283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.07300: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.07639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.07724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.07746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.07768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.07853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.09656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.09660: stdout chunk (state=3): >>><<< 26264 1727204245.09663: stderr chunk (state=3): >>><<< 26264 1727204245.09873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204245.09881: handler run complete 26264 1727204245.09884: Evaluated conditional (False): False 26264 1727204245.09886: attempt loop complete, returning result 26264 1727204245.09888: variable 'item' from source: unknown 26264 1727204245.09890: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.003510", "end": "2024-09-24 14:57:25.049488", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-24 14:57:25.045978" } 26264 1727204245.10070: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204245.10111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204245.10115: variable 'omit' from source: magic vars 26264 1727204245.10285: variable 'ansible_distribution_major_version' from source: facts 26264 1727204245.10296: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204245.10614: variable 'type' from source: set_fact 26264 1727204245.10624: variable 'state' from source: include params 26264 1727204245.10632: variable 'interface' from source: set_fact 26264 1727204245.10640: variable 'current_interfaces' from source: set_fact 26264 
1727204245.10658: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 26264 1727204245.10669: variable 'omit' from source: magic vars 26264 1727204245.10785: variable 'omit' from source: magic vars 26264 1727204245.10828: variable 'item' from source: unknown 26264 1727204245.11014: variable 'item' from source: unknown 26264 1727204245.11033: variable 'omit' from source: magic vars 26264 1727204245.11062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204245.11079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204245.11098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204245.11117: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204245.11206: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204245.11215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204245.11303: Set connection var ansible_pipelining to False 26264 1727204245.11429: Set connection var ansible_connection to ssh 26264 1727204245.11437: Set connection var ansible_shell_type to sh 26264 1727204245.11450: Set connection var ansible_shell_executable to /bin/sh 26264 1727204245.11465: Set connection var ansible_timeout to 10 26264 1727204245.11478: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204245.11503: variable 'ansible_shell_executable' from source: unknown 26264 1727204245.11510: variable 'ansible_connection' from source: unknown 26264 1727204245.11515: variable 'ansible_module_compression' from source: unknown 26264 1727204245.11521: variable 'ansible_shell_type' from source: unknown 26264 1727204245.11557: 
variable 'ansible_shell_executable' from source: unknown 26264 1727204245.11588: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204245.11619: variable 'ansible_pipelining' from source: unknown 26264 1727204245.11647: variable 'ansible_timeout' from source: unknown 26264 1727204245.11658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204245.11981: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204245.12055: variable 'omit' from source: magic vars 26264 1727204245.12069: starting attempt loop 26264 1727204245.12078: running the handler 26264 1727204245.12088: _low_level_execute_command(): starting 26264 1727204245.12107: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204245.12869: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.12884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.12898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.12916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.12968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.12982: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.12996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.13013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204245.13025: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.13035: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.13051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.13070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.13088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.13101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.13112: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.13126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.13204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.13228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.13246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.13318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.14816: stdout chunk (state=3): >>>/root <<< 26264 1727204245.15904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.15992: stderr chunk (state=3): >>><<< 26264 1727204245.16078: stdout chunk (state=3): >>><<< 26264 1727204245.16185: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204245.16192: _low_level_execute_command(): starting 26264 1727204245.16196: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911 `" && echo ansible-tmp-1727204245.16104-26996-36482902176911="` echo /root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911 `" ) && sleep 0' 26264 1727204245.17671: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.17685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.17699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.17720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.17768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.17840: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.17858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.17882: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 26264 1727204245.17893: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.17905: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.17918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.17934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.17955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.17970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.17983: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.17997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.18190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.18213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.18231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.18306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.20107: stdout chunk (state=3): >>>ansible-tmp-1727204245.16104-26996-36482902176911=/root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911 <<< 26264 1727204245.20314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.20317: stdout chunk (state=3): >>><<< 26264 1727204245.20320: stderr chunk (state=3): >>><<< 26264 1727204245.20371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204245.16104-26996-36482902176911=/root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204245.20374: variable 'ansible_module_compression' from source: unknown 26264 1727204245.20472: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26264 1727204245.20475: variable 'ansible_facts' from source: unknown 26264 1727204245.20584: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911/AnsiballZ_command.py 26264 1727204245.21190: Sending initial data 26264 1727204245.21193: Sent initial data (153 bytes) 26264 1727204245.23289: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.23296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.23310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.23324: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.23364: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.23375: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.23385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.23399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204245.23402: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.23409: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.23463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.23477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.23489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.23496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.23503: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.23513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.23698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.23716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.23729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.23801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.25471: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204245.25498: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204245.25536: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpf_zd_es8 /root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911/AnsiballZ_command.py <<< 26264 1727204245.25573: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204245.26385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.26490: stderr chunk (state=3): >>><<< 26264 1727204245.26493: stdout chunk (state=3): >>><<< 26264 1727204245.26509: done transferring module to remote 26264 1727204245.26516: _low_level_execute_command(): starting 26264 1727204245.26520: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911/ /root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911/AnsiballZ_command.py && sleep 0' 26264 1727204245.27388: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.27397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.27406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.27420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 
1727204245.27481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.27488: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.27498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.27511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204245.27518: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.27525: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.27532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.27541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.27553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.27559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.27574: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.27583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.27656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.27675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.27692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.27785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.29440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.29487: stderr chunk (state=3): >>><<< 26264 1727204245.29489: stdout chunk (state=3): >>><<< 26264 1727204245.29518: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204245.29522: _low_level_execute_command(): starting 26264 1727204245.29524: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911/AnsiballZ_command.py && sleep 0' 26264 1727204245.29924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.29930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.29982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.29985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.29988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204245.30002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.30077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.30083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.30089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.30138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.43693: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-24 14:57:25.429212", "end": "2024-09-24 14:57:25.435013", "delta": "0:00:00.005801", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26264 1727204245.44887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204245.44891: stdout chunk (state=3): >>><<< 26264 1727204245.44893: stderr chunk (state=3): >>><<< 26264 1727204245.45050: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-24 14:57:25.429212", "end": "2024-09-24 14:57:25.435013", "delta": "0:00:00.005801", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204245.45059: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204245.45061: _low_level_execute_command(): starting 26264 1727204245.45066: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204245.16104-26996-36482902176911/ > /dev/null 2>&1 && sleep 0' 26264 1727204245.45658: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.45678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.45694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.45712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.45756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.45774: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.45790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.45808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204245.45820: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.45831: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.45843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.45857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.45875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.45890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.45900: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.45912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.45995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.46017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.46036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.46106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.47917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.47925: stdout chunk (state=3): >>><<< 26264 1727204245.47927: stderr chunk (state=3): >>><<< 26264 1727204245.48212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204245.48216: handler run complete 26264 1727204245.48218: Evaluated conditional (False): False 26264 1727204245.48221: attempt loop complete, returning result 26264 1727204245.48222: variable 'item' from source: unknown 26264 1727204245.48224: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.005801", "end": "2024-09-24 14:57:25.435013", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-24 14:57:25.429212" } 26264 1727204245.48324: dumping result to json 26264 1727204245.48327: done dumping result, returning 26264 1727204245.48330: done running TaskExecutor() for managed-node3/TASK: Create veth interface lsr27 [0affcd87-79f5-5ff5-08b0-000000000135] 26264 1727204245.48332: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000135 26264 1727204245.48388: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000135 26264 1727204245.48392: WORKER PROCESS EXITING 26264 1727204245.48462: no more pending results, returning what we have 26264 1727204245.48472: results queue empty 26264 1727204245.48473: checking for any_errors_fatal 26264 1727204245.48481: done checking for any_errors_fatal 26264 1727204245.48482: checking 
for max_fail_percentage 26264 1727204245.48484: done checking for max_fail_percentage 26264 1727204245.48485: checking to see if all hosts have failed and the running result is not ok 26264 1727204245.48487: done checking to see if all hosts have failed 26264 1727204245.48487: getting the remaining hosts for this loop 26264 1727204245.48489: done getting the remaining hosts for this loop 26264 1727204245.48493: getting the next task for host managed-node3 26264 1727204245.48499: done getting next task for host managed-node3 26264 1727204245.48502: ^ task is: TASK: Set up veth as managed by NetworkManager 26264 1727204245.48506: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204245.48510: getting variables 26264 1727204245.48512: in VariableManager get_vars() 26264 1727204245.48542: Calling all_inventory to load vars for managed-node3 26264 1727204245.48546: Calling groups_inventory to load vars for managed-node3 26264 1727204245.48549: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204245.48561: Calling all_plugins_play to load vars for managed-node3 26264 1727204245.48566: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204245.48570: Calling groups_plugins_play to load vars for managed-node3 26264 1727204245.48792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204245.49375: done with get_vars() 26264 1727204245.49386: done getting variables 26264 1727204245.49446: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:57:25 -0400 (0:00:01.140) 0:00:09.343 ***** 26264 1727204245.49483: entering _queue_task() for managed-node3/command 26264 1727204245.49724: worker is 1 (out of 1 available) 26264 1727204245.49735: exiting _queue_task() for managed-node3/command 26264 1727204245.49747: done queuing things up, now waiting for results queue to drain 26264 1727204245.49749: waiting for pending results... 
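The task banner above carries two durations, "(0:00:01.140) 0:00:09.343". A minimal sketch of that formatting, assuming (the log does not state it) that the first figure is the previous task's elapsed time and the second is the cumulative playbook runtime, both rendered as H:MM:SS.mmm:

```python
from datetime import timedelta

def fmt(seconds):
    """Format a duration in seconds as H:MM:SS.mmm, matching the banner style."""
    td = timedelta(seconds=seconds)
    total = int(td.total_seconds())
    h, rem = divmod(total, 3600)
    m, s = divmod(rem, 60)
    ms = int(td.microseconds / 1000)
    return "%d:%02d:%02d.%03d" % (h, m, s, ms)

print(fmt(1.140))  # 0:00:01.140
print(fmt(9.343))  # 0:00:09.343
```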
26264 1727204245.50110: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 26264 1727204245.50207: in run() - task 0affcd87-79f5-5ff5-08b0-000000000136 26264 1727204245.50225: variable 'ansible_search_path' from source: unknown 26264 1727204245.50275: variable 'ansible_search_path' from source: unknown 26264 1727204245.50332: calling self._execute() 26264 1727204245.50501: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204245.50512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204245.50530: variable 'omit' from source: magic vars 26264 1727204245.50883: variable 'ansible_distribution_major_version' from source: facts 26264 1727204245.50901: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204245.51167: variable 'type' from source: set_fact 26264 1727204245.51307: variable 'state' from source: include params 26264 1727204245.51318: Evaluated conditional (type == 'veth' and state == 'present'): True 26264 1727204245.51328: variable 'omit' from source: magic vars 26264 1727204245.51369: variable 'omit' from source: magic vars 26264 1727204245.51476: variable 'interface' from source: set_fact 26264 1727204245.51498: variable 'omit' from source: magic vars 26264 1727204245.51545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204245.51587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204245.51612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204245.51640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204245.51657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
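The two conditionals just evaluated — `ansible_distribution_major_version != '6'` and `type == 'veth' and state == 'present'` — gate whether this task runs at all. A hedged stand-in for that evaluation (Ansible actually templates conditionals through Jinja2; a restricted `eval` is used here only for illustration, and the distribution version is an assumed value since the log shows only the boolean result):

```python
task_vars = {
    "ansible_distribution_major_version": "9",  # assumed; log only shows the result (True)
    "type": "veth",      # from set_fact, per the log
    "state": "present",  # from include params, per the log
}

def evaluate(expr, variables):
    # eval() with builtins stripped stands in for Jinja2 templating here
    return bool(eval(expr, {"__builtins__": {}}, variables))

print(evaluate("ansible_distribution_major_version != '6'", task_vars))  # True
print(evaluate("type == 'veth' and state == 'present'", task_vars))      # True
```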
26264 1727204245.51749: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204245.51759: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204245.51770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204245.51925: Set connection var ansible_pipelining to False 26264 1727204245.51933: Set connection var ansible_connection to ssh 26264 1727204245.51938: Set connection var ansible_shell_type to sh 26264 1727204245.51952: Set connection var ansible_shell_executable to /bin/sh 26264 1727204245.51966: Set connection var ansible_timeout to 10 26264 1727204245.51978: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204245.52003: variable 'ansible_shell_executable' from source: unknown 26264 1727204245.52010: variable 'ansible_connection' from source: unknown 26264 1727204245.52016: variable 'ansible_module_compression' from source: unknown 26264 1727204245.52022: variable 'ansible_shell_type' from source: unknown 26264 1727204245.52029: variable 'ansible_shell_executable' from source: unknown 26264 1727204245.52036: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204245.52043: variable 'ansible_pipelining' from source: unknown 26264 1727204245.52048: variable 'ansible_timeout' from source: unknown 26264 1727204245.52060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204245.52202: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204245.52218: variable 'omit' from source: magic vars 26264 1727204245.52229: starting attempt loop 26264 1727204245.52235: running the handler 26264 1727204245.52255: _low_level_execute_command(): 
starting 26264 1727204245.52273: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204245.54087: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.54259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.54263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.54271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.54291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.54304: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.54317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.54336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204245.54345: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.54360: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.54377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.54392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.54409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.54424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.54442: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.54462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.54550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 
1727204245.54676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.54693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.54781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.56336: stdout chunk (state=3): >>>/root <<< 26264 1727204245.56571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.56574: stdout chunk (state=3): >>><<< 26264 1727204245.56577: stderr chunk (state=3): >>><<< 26264 1727204245.56581: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204245.56584: _low_level_execute_command(): starting 26264 1727204245.56593: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199 `" && echo ansible-tmp-1727204245.5656314-27083-261990325566199="` echo /root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199 `" ) && sleep 0' 26264 1727204245.57340: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.57344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.57388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.57391: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.57394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.57396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.57454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.57510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.57518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.57629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.59380: stdout chunk (state=3): 
>>>ansible-tmp-1727204245.5656314-27083-261990325566199=/root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199 <<< 26264 1727204245.59577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.59580: stdout chunk (state=3): >>><<< 26264 1727204245.59583: stderr chunk (state=3): >>><<< 26264 1727204245.59873: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204245.5656314-27083-261990325566199=/root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204245.59877: variable 'ansible_module_compression' from source: unknown 26264 1727204245.59879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26264 1727204245.59882: variable 'ansible_facts' from source: unknown 26264 
1727204245.59884: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199/AnsiballZ_command.py 26264 1727204245.60474: Sending initial data 26264 1727204245.60477: Sent initial data (156 bytes) 26264 1727204245.63072: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.63089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.63103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.63120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.63174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.63256: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.63273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.63290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204245.63301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.63311: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.63323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.63337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.63366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.63380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.63393: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.63408: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.63604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.63630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.63651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.63736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.65429: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204245.65473: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204245.65507: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpnlk0i9zy /root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199/AnsiballZ_command.py <<< 26264 1727204245.65543: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204245.66861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.66868: stderr chunk (state=3): >>><<< 26264 1727204245.66980: stdout chunk (state=3): >>><<< 26264 1727204245.66984: done transferring module to remote 26264 1727204245.66986: _low_level_execute_command(): starting 26264 1727204245.66989: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199/ /root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199/AnsiballZ_command.py && sleep 0' 26264 1727204245.68355: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.68513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.68528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.68547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.68592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.68608: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.68629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.68646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204245.68661: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.68676: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.68689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.68702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.68721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.68738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.68750: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.68766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.68960: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.68984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.69007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.69082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.70842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.70846: stdout chunk (state=3): >>><<< 26264 1727204245.70848: stderr chunk (state=3): >>><<< 26264 1727204245.70869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204245.70954: _low_level_execute_command(): starting 26264 1727204245.70957: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199/AnsiballZ_command.py && sleep 0' 26264 1727204245.72902: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.72982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.72998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.73024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.73107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.73133: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.73184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.73243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204245.73257: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.73272: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.73311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.73330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.73350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.73362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.73375: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.73389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.73578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 
1727204245.73601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.73618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.73697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.88784: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-24 14:57:25.867106", "end": "2024-09-24 14:57:25.886356", "delta": "0:00:00.019250", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26264 1727204245.90090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204245.90135: stderr chunk (state=3): >>><<< 26264 1727204245.90139: stdout chunk (state=3): >>><<< 26264 1727204245.90171: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-24 14:57:25.867106", "end": "2024-09-24 14:57:25.886356", "delta": "0:00:00.019250", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
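The module's reply above arrives as a single JSON object on stdout. A minimal sketch of picking the interesting fields out of it, with the values copied from the logged result:

```python
import json

# Trimmed copy of the module result printed in the log above
raw = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
"cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"],
"start": "2024-09-24 14:57:25.867106", "end": "2024-09-24 14:57:25.886356",
"delta": "0:00:00.019250", "msg": ""}'''

result = json.loads(raw)
print(result["rc"])             # 0
print(" ".join(result["cmd"]))  # nmcli d set lsr27 managed true
```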
26264 1727204245.90302: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204245.90305: _low_level_execute_command(): starting 26264 1727204245.90308: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204245.5656314-27083-261990325566199/ > /dev/null 2>&1 && sleep 0' 26264 1727204245.91819: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204245.91963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.91978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.91992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.92034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.92041: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204245.92052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.92065: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204245.92072: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204245.92082: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204245.92090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204245.92099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204245.92110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204245.92117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204245.92123: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204245.92133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204245.92208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204245.92227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204245.92239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204245.92305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204245.94132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204245.94136: stdout chunk (state=3): >>><<< 26264 1727204245.94142: stderr chunk (state=3): >>><<< 26264 1727204245.94161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204245.94171: handler run complete 26264 1727204245.94196: Evaluated conditional (False): False 26264 1727204245.94204: attempt loop complete, returning result 26264 1727204245.94207: _execute() done 26264 1727204245.94210: dumping result to json 26264 1727204245.94216: done dumping result, returning 26264 1727204245.94224: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-5ff5-08b0-000000000136] 26264 1727204245.94229: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000136 26264 1727204245.94332: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000136 26264 1727204245.94335: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.019250", "end": "2024-09-24 14:57:25.886356", "rc": 0, "start": "2024-09-24 14:57:25.867106" } 26264 1727204245.94427: no more pending results, returning what we have 26264 1727204245.94431: results queue empty 26264 1727204245.94432: checking for any_errors_fatal 26264 1727204245.94443: done checking for any_errors_fatal 26264 1727204245.94444: checking for max_fail_percentage 26264 1727204245.94446: done checking for max_fail_percentage 26264 1727204245.94447: checking to see if all 
hosts have failed and the running result is not ok 26264 1727204245.94450: done checking to see if all hosts have failed 26264 1727204245.94451: getting the remaining hosts for this loop 26264 1727204245.94452: done getting the remaining hosts for this loop 26264 1727204245.94456: getting the next task for host managed-node3 26264 1727204245.94462: done getting next task for host managed-node3 26264 1727204245.94466: ^ task is: TASK: Delete veth interface {{ interface }} 26264 1727204245.94469: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204245.94473: getting variables 26264 1727204245.94474: in VariableManager get_vars() 26264 1727204245.94501: Calling all_inventory to load vars for managed-node3 26264 1727204245.94504: Calling groups_inventory to load vars for managed-node3 26264 1727204245.94507: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204245.94516: Calling all_plugins_play to load vars for managed-node3 26264 1727204245.94518: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204245.94521: Calling groups_plugins_play to load vars for managed-node3 26264 1727204245.94698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204245.94906: done with get_vars() 26264 1727204245.94916: done getting variables 26264 1727204245.94979: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204245.95911: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.464) 0:00:09.808 ***** 26264 1727204245.95942: entering _queue_task() for managed-node3/command 26264 1727204245.96226: worker is 1 (out of 1 available) 26264 1727204245.96239: exiting _queue_task() for managed-node3/command 26264 1727204245.96255: done queuing things up, now waiting for results queue to drain 26264 1727204245.96256: waiting for pending results... 
26264 1727204245.97017: running TaskExecutor() for managed-node3/TASK: Delete veth interface lsr27 26264 1727204245.97203: in run() - task 0affcd87-79f5-5ff5-08b0-000000000137 26264 1727204245.97337: variable 'ansible_search_path' from source: unknown 26264 1727204245.97345: variable 'ansible_search_path' from source: unknown 26264 1727204245.97388: calling self._execute() 26264 1727204245.97593: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204245.97606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204245.97619: variable 'omit' from source: magic vars 26264 1727204245.98233: variable 'ansible_distribution_major_version' from source: facts 26264 1727204245.98599: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204245.99476: variable 'type' from source: set_fact 26264 1727204245.99710: variable 'state' from source: include params 26264 1727204245.99754: variable 'interface' from source: set_fact 26264 1727204245.99811: variable 'current_interfaces' from source: set_fact 26264 1727204245.99824: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 26264 1727204245.99831: when evaluation is False, skipping this task 26264 1727204245.99841: _execute() done 26264 1727204245.99881: dumping result to json 26264 1727204245.99912: done dumping result, returning 26264 1727204245.99954: done running TaskExecutor() for managed-node3/TASK: Delete veth interface lsr27 [0affcd87-79f5-5ff5-08b0-000000000137] 26264 1727204245.99997: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000137 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 26264 1727204246.00380: no more pending results, returning what we have 26264 1727204246.00385: results queue empty 26264 1727204246.00386: 
checking for any_errors_fatal 26264 1727204246.00419: done checking for any_errors_fatal 26264 1727204246.00420: checking for max_fail_percentage 26264 1727204246.00422: done checking for max_fail_percentage 26264 1727204246.00422: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.00424: done checking to see if all hosts have failed 26264 1727204246.00425: getting the remaining hosts for this loop 26264 1727204246.00427: done getting the remaining hosts for this loop 26264 1727204246.00441: getting the next task for host managed-node3 26264 1727204246.00449: done getting next task for host managed-node3 26264 1727204246.00452: ^ task is: TASK: Create dummy interface {{ interface }} 26264 1727204246.00455: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.00460: getting variables 26264 1727204246.00462: in VariableManager get_vars() 26264 1727204246.00619: Calling all_inventory to load vars for managed-node3 26264 1727204246.00647: Calling groups_inventory to load vars for managed-node3 26264 1727204246.00667: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.00725: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.00729: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.00735: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000137 26264 1727204246.00741: WORKER PROCESS EXITING 26264 1727204246.00786: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.01110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.02199: done with get_vars() 26264 1727204246.02230: done getting variables 26264 1727204246.02452: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204246.02681: variable 'interface' from source: set_fact TASK [Create dummy interface lsr27] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:57:26 -0400 (0:00:00.067) 0:00:09.876 ***** 26264 1727204246.02713: entering _queue_task() for managed-node3/command 26264 1727204246.03493: worker is 1 (out of 1 available) 26264 1727204246.03506: exiting _queue_task() for managed-node3/command 26264 1727204246.03633: done queuing things up, now waiting for results queue to drain 26264 1727204246.03635: waiting for pending results... 
26264 1727204246.04336: running TaskExecutor() for managed-node3/TASK: Create dummy interface lsr27 26264 1727204246.04421: in run() - task 0affcd87-79f5-5ff5-08b0-000000000138 26264 1727204246.05086: variable 'ansible_search_path' from source: unknown 26264 1727204246.05095: variable 'ansible_search_path' from source: unknown 26264 1727204246.05137: calling self._execute() 26264 1727204246.05422: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.05487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.05511: variable 'omit' from source: magic vars 26264 1727204246.06570: variable 'ansible_distribution_major_version' from source: facts 26264 1727204246.06708: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204246.07257: variable 'type' from source: set_fact 26264 1727204246.07450: variable 'state' from source: include params 26264 1727204246.07511: variable 'interface' from source: set_fact 26264 1727204246.07528: variable 'current_interfaces' from source: set_fact 26264 1727204246.07540: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 26264 1727204246.07551: when evaluation is False, skipping this task 26264 1727204246.07559: _execute() done 26264 1727204246.07598: dumping result to json 26264 1727204246.07645: done dumping result, returning 26264 1727204246.07753: done running TaskExecutor() for managed-node3/TASK: Create dummy interface lsr27 [0affcd87-79f5-5ff5-08b0-000000000138] 26264 1727204246.07768: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000138 26264 1727204246.07894: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000138 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 26264 1727204246.07947: no more 
pending results, returning what we have 26264 1727204246.07951: results queue empty 26264 1727204246.07952: checking for any_errors_fatal 26264 1727204246.07958: done checking for any_errors_fatal 26264 1727204246.07959: checking for max_fail_percentage 26264 1727204246.07961: done checking for max_fail_percentage 26264 1727204246.07963: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.07964: done checking to see if all hosts have failed 26264 1727204246.07968: getting the remaining hosts for this loop 26264 1727204246.07970: done getting the remaining hosts for this loop 26264 1727204246.07976: getting the next task for host managed-node3 26264 1727204246.07981: done getting next task for host managed-node3 26264 1727204246.07984: ^ task is: TASK: Delete dummy interface {{ interface }} 26264 1727204246.07987: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.07992: getting variables 26264 1727204246.07993: in VariableManager get_vars() 26264 1727204246.08022: Calling all_inventory to load vars for managed-node3 26264 1727204246.08025: Calling groups_inventory to load vars for managed-node3 26264 1727204246.08029: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.08042: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.08045: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.08047: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.08208: WORKER PROCESS EXITING 26264 1727204246.08231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.08435: done with get_vars() 26264 1727204246.08446: done getting variables 26264 1727204246.08509: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204246.08760: variable 'interface' from source: set_fact TASK [Delete dummy interface lsr27] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:57:26 -0400 (0:00:00.060) 0:00:09.937 ***** 26264 1727204246.08813: entering _queue_task() for managed-node3/command 26264 1727204246.09806: worker is 1 (out of 1 available) 26264 1727204246.09820: exiting _queue_task() for managed-node3/command 26264 1727204246.09834: done queuing things up, now waiting for results queue to drain 26264 1727204246.09835: waiting for pending results... 
26264 1727204246.11009: running TaskExecutor() for managed-node3/TASK: Delete dummy interface lsr27 26264 1727204246.11407: in run() - task 0affcd87-79f5-5ff5-08b0-000000000139 26264 1727204246.11419: variable 'ansible_search_path' from source: unknown 26264 1727204246.11422: variable 'ansible_search_path' from source: unknown 26264 1727204246.11493: calling self._execute() 26264 1727204246.11754: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.11761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.11875: variable 'omit' from source: magic vars 26264 1727204246.13037: variable 'ansible_distribution_major_version' from source: facts 26264 1727204246.13293: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204246.14445: variable 'type' from source: set_fact 26264 1727204246.14468: variable 'state' from source: include params 26264 1727204246.14511: variable 'interface' from source: set_fact 26264 1727204246.14521: variable 'current_interfaces' from source: set_fact 26264 1727204246.14662: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 26264 1727204246.14672: when evaluation is False, skipping this task 26264 1727204246.14678: _execute() done 26264 1727204246.14685: dumping result to json 26264 1727204246.14692: done dumping result, returning 26264 1727204246.14703: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface lsr27 [0affcd87-79f5-5ff5-08b0-000000000139] 26264 1727204246.14716: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000139 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 26264 1727204246.14868: no more pending results, returning what we have 26264 1727204246.14872: results queue empty 26264 1727204246.14873: 
checking for any_errors_fatal 26264 1727204246.14878: done checking for any_errors_fatal 26264 1727204246.14879: checking for max_fail_percentage 26264 1727204246.14881: done checking for max_fail_percentage 26264 1727204246.14881: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.14882: done checking to see if all hosts have failed 26264 1727204246.14883: getting the remaining hosts for this loop 26264 1727204246.14884: done getting the remaining hosts for this loop 26264 1727204246.14889: getting the next task for host managed-node3 26264 1727204246.14895: done getting next task for host managed-node3 26264 1727204246.14898: ^ task is: TASK: Create tap interface {{ interface }} 26264 1727204246.14901: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.14905: getting variables 26264 1727204246.14907: in VariableManager get_vars() 26264 1727204246.14944: Calling all_inventory to load vars for managed-node3 26264 1727204246.14948: Calling groups_inventory to load vars for managed-node3 26264 1727204246.14952: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.14968: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.14971: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.14974: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.15192: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000139 26264 1727204246.15195: WORKER PROCESS EXITING 26264 1727204246.15217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.15552: done with get_vars() 26264 1727204246.15563: done getting variables 26264 1727204246.15860: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204246.16229: variable 'interface' from source: set_fact TASK [Create tap interface lsr27] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:57:26 -0400 (0:00:00.074) 0:00:10.012 ***** 26264 1727204246.16288: entering _queue_task() for managed-node3/command 26264 1727204246.17157: worker is 1 (out of 1 available) 26264 1727204246.17173: exiting _queue_task() for managed-node3/command 26264 1727204246.17185: done queuing things up, now waiting for results queue to drain 26264 1727204246.17187: waiting for pending results... 
26264 1727204246.17413: running TaskExecutor() for managed-node3/TASK: Create tap interface lsr27 26264 1727204246.17824: in run() - task 0affcd87-79f5-5ff5-08b0-00000000013a 26264 1727204246.17908: variable 'ansible_search_path' from source: unknown 26264 1727204246.17917: variable 'ansible_search_path' from source: unknown 26264 1727204246.17962: calling self._execute() 26264 1727204246.18060: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.18065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.18076: variable 'omit' from source: magic vars 26264 1727204246.18752: variable 'ansible_distribution_major_version' from source: facts 26264 1727204246.18772: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204246.19045: variable 'type' from source: set_fact 26264 1727204246.19058: variable 'state' from source: include params 26264 1727204246.19069: variable 'interface' from source: set_fact 26264 1727204246.19077: variable 'current_interfaces' from source: set_fact 26264 1727204246.19088: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 26264 1727204246.19094: when evaluation is False, skipping this task 26264 1727204246.19099: _execute() done 26264 1727204246.19105: dumping result to json 26264 1727204246.19111: done dumping result, returning 26264 1727204246.19120: done running TaskExecutor() for managed-node3/TASK: Create tap interface lsr27 [0affcd87-79f5-5ff5-08b0-00000000013a] 26264 1727204246.19130: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000013a 26264 1727204246.19347: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000013a 26264 1727204246.19355: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 26264 1727204246.19423: no more pending results, returning what we have 26264 1727204246.19426: results queue empty 26264 1727204246.19427: checking for any_errors_fatal 26264 1727204246.19433: done checking for any_errors_fatal 26264 1727204246.19434: checking for max_fail_percentage 26264 1727204246.19435: done checking for max_fail_percentage 26264 1727204246.19436: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.19437: done checking to see if all hosts have failed 26264 1727204246.19438: getting the remaining hosts for this loop 26264 1727204246.19439: done getting the remaining hosts for this loop 26264 1727204246.19443: getting the next task for host managed-node3 26264 1727204246.19451: done getting next task for host managed-node3 26264 1727204246.19454: ^ task is: TASK: Delete tap interface {{ interface }} 26264 1727204246.19457: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.19462: getting variables 26264 1727204246.19465: in VariableManager get_vars() 26264 1727204246.19497: Calling all_inventory to load vars for managed-node3 26264 1727204246.19500: Calling groups_inventory to load vars for managed-node3 26264 1727204246.19505: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.19520: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.19523: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.19526: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.19711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.19957: done with get_vars() 26264 1727204246.19970: done getting variables 26264 1727204246.20068: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204246.20230: variable 'interface' from source: set_fact TASK [Delete tap interface lsr27] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:57:26 -0400 (0:00:00.039) 0:00:10.051 ***** 26264 1727204246.20263: entering _queue_task() for managed-node3/command 26264 1727204246.20765: worker is 1 (out of 1 available) 26264 1727204246.20779: exiting _queue_task() for managed-node3/command 26264 1727204246.20792: done queuing things up, now waiting for results queue to drain 26264 1727204246.20794: waiting for pending results... 
26264 1727204246.21486: running TaskExecutor() for managed-node3/TASK: Delete tap interface lsr27 26264 1727204246.21663: in run() - task 0affcd87-79f5-5ff5-08b0-00000000013b 26264 1727204246.21685: variable 'ansible_search_path' from source: unknown 26264 1727204246.21714: variable 'ansible_search_path' from source: unknown 26264 1727204246.21756: calling self._execute() 26264 1727204246.21911: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.21926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.21940: variable 'omit' from source: magic vars 26264 1727204246.22621: variable 'ansible_distribution_major_version' from source: facts 26264 1727204246.22741: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204246.23266: variable 'type' from source: set_fact 26264 1727204246.23277: variable 'state' from source: include params 26264 1727204246.23286: variable 'interface' from source: set_fact 26264 1727204246.23293: variable 'current_interfaces' from source: set_fact 26264 1727204246.23306: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 26264 1727204246.23312: when evaluation is False, skipping this task 26264 1727204246.23328: _execute() done 26264 1727204246.23352: dumping result to json 26264 1727204246.23366: done dumping result, returning 26264 1727204246.23397: done running TaskExecutor() for managed-node3/TASK: Delete tap interface lsr27 [0affcd87-79f5-5ff5-08b0-00000000013b] 26264 1727204246.23410: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000013b skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 26264 1727204246.23630: no more pending results, returning what we have 26264 1727204246.23633: results queue empty 26264 1727204246.23634: checking 
for any_errors_fatal 26264 1727204246.23641: done checking for any_errors_fatal 26264 1727204246.23642: checking for max_fail_percentage 26264 1727204246.23644: done checking for max_fail_percentage 26264 1727204246.23645: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.23649: done checking to see if all hosts have failed 26264 1727204246.23650: getting the remaining hosts for this loop 26264 1727204246.23654: done getting the remaining hosts for this loop 26264 1727204246.23661: getting the next task for host managed-node3 26264 1727204246.23693: done getting next task for host managed-node3 26264 1727204246.23697: ^ task is: TASK: Include the task 'assert_device_present.yml' 26264 1727204246.23700: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.23705: getting variables 26264 1727204246.23707: in VariableManager get_vars() 26264 1727204246.23746: Calling all_inventory to load vars for managed-node3 26264 1727204246.23752: Calling groups_inventory to load vars for managed-node3 26264 1727204246.23756: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.23773: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.23776: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.23780: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.24081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.24286: done with get_vars() 26264 1727204246.24298: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Tuesday 24 September 2024 14:57:26 -0400 (0:00:00.041) 0:00:10.093 ***** 26264 1727204246.24427: entering _queue_task() for managed-node3/include_tasks 26264 1727204246.24447: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000013b 26264 1727204246.24459: WORKER PROCESS EXITING 26264 1727204246.25038: worker is 1 (out of 1 available) 26264 1727204246.25054: exiting _queue_task() for managed-node3/include_tasks 26264 1727204246.25070: done queuing things up, now waiting for results queue to drain 26264 1727204246.25072: waiting for pending results... 
26264 1727204246.25406: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 26264 1727204246.25503: in run() - task 0affcd87-79f5-5ff5-08b0-000000000012 26264 1727204246.25524: variable 'ansible_search_path' from source: unknown 26264 1727204246.25568: calling self._execute() 26264 1727204246.25659: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.25674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.25688: variable 'omit' from source: magic vars 26264 1727204246.26133: variable 'ansible_distribution_major_version' from source: facts 26264 1727204246.26154: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204246.26167: _execute() done 26264 1727204246.26177: dumping result to json 26264 1727204246.26184: done dumping result, returning 26264 1727204246.26196: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [0affcd87-79f5-5ff5-08b0-000000000012] 26264 1727204246.26207: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000012 26264 1727204246.26335: no more pending results, returning what we have 26264 1727204246.26340: in VariableManager get_vars() 26264 1727204246.26382: Calling all_inventory to load vars for managed-node3 26264 1727204246.26385: Calling groups_inventory to load vars for managed-node3 26264 1727204246.26389: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.26403: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.26407: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.26410: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.26613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.26816: done with get_vars() 26264 1727204246.26823: variable 'ansible_search_path' 
from source: unknown 26264 1727204246.26840: we have included files to process 26264 1727204246.26842: generating all_blocks data 26264 1727204246.26844: done generating all_blocks data 26264 1727204246.26854: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 26264 1727204246.26855: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 26264 1727204246.26858: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 26264 1727204246.27369: in VariableManager get_vars() 26264 1727204246.27386: done with get_vars() 26264 1727204246.27531: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000012 26264 1727204246.27535: WORKER PROCESS EXITING 26264 1727204246.27727: done processing included file 26264 1727204246.27729: iterating over new_blocks loaded from include file 26264 1727204246.27731: in VariableManager get_vars() 26264 1727204246.27742: done with get_vars() 26264 1727204246.27744: filtering new block on tags 26264 1727204246.27766: done filtering new block on tags 26264 1727204246.27768: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 26264 1727204246.27773: extending task lists for all hosts with included blocks 26264 1727204246.28461: done extending task lists 26264 1727204246.28462: done processing included files 26264 1727204246.28463: results queue empty 26264 1727204246.28465: checking for any_errors_fatal 26264 1727204246.28468: done checking for any_errors_fatal 26264 1727204246.28469: checking for max_fail_percentage 26264 1727204246.28470: done checking for max_fail_percentage 26264 1727204246.28471: checking to 
see if all hosts have failed and the running result is not ok 26264 1727204246.28472: done checking to see if all hosts have failed 26264 1727204246.28472: getting the remaining hosts for this loop 26264 1727204246.28474: done getting the remaining hosts for this loop 26264 1727204246.28476: getting the next task for host managed-node3 26264 1727204246.28480: done getting next task for host managed-node3 26264 1727204246.28482: ^ task is: TASK: Include the task 'get_interface_stat.yml' 26264 1727204246.28485: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.28487: getting variables 26264 1727204246.28488: in VariableManager get_vars() 26264 1727204246.28496: Calling all_inventory to load vars for managed-node3 26264 1727204246.28498: Calling groups_inventory to load vars for managed-node3 26264 1727204246.28500: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.28505: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.28507: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.28510: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.28647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.28843: done with get_vars() 26264 1727204246.28853: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:57:26 -0400 (0:00:00.045) 0:00:10.138 ***** 26264 1727204246.28925: entering _queue_task() for managed-node3/include_tasks 26264 1727204246.29352: worker is 1 (out of 1 available) 26264 1727204246.29363: exiting _queue_task() for managed-node3/include_tasks 26264 1727204246.29378: done queuing things up, now waiting for results queue to drain 26264 1727204246.29380: waiting for pending results... 
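Every entry in this verbose trace follows the shape `<pid> <epoch.frac>: <message>` (here, pid 26264). When working with output like this, it can help to split the stream back into individual entries, for example to isolate the `VariableManager get_vars()` calls or time a task. This is a minimal sketch assuming that entry shape; the pattern is inferred from this trace, not an official Ansible log format guarantee, and the helper name `iter_entries` is made up:

```python
import re
from typing import Iterator, Tuple

# One verbose-log entry: "<pid> <epoch-with-fraction>: <message>"
# (pattern inferred from this -vvvv trace, not a documented format)
ENTRY_RE = re.compile(r"(\d+)\s+(\d+\.\d+):\s")

def iter_entries(text: str) -> Iterator[Tuple[int, float, str]]:
    """Split a -vvvv trace into (pid, timestamp, message) tuples."""
    parts = ENTRY_RE.split(text)
    # re.split yields [prefix, pid, ts, msg, pid, ts, msg, ...]
    for i in range(1, len(parts) - 2, 3):
        yield int(parts[i]), float(parts[i + 1]), parts[i + 2].strip()
```

With entries separated out, the timestamps make it straightforward to see where time is spent, e.g. the roughly 30 ms gap between queuing the include task and the worker picking it up above.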
26264 1727204246.29689: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 26264 1727204246.29789: in run() - task 0affcd87-79f5-5ff5-08b0-0000000001d3 26264 1727204246.29805: variable 'ansible_search_path' from source: unknown 26264 1727204246.29812: variable 'ansible_search_path' from source: unknown 26264 1727204246.29857: calling self._execute() 26264 1727204246.29940: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.29953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.29970: variable 'omit' from source: magic vars 26264 1727204246.30367: variable 'ansible_distribution_major_version' from source: facts 26264 1727204246.30387: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204246.30474: _execute() done 26264 1727204246.30484: dumping result to json 26264 1727204246.30491: done dumping result, returning 26264 1727204246.30500: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-5ff5-08b0-0000000001d3] 26264 1727204246.30510: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000001d3 26264 1727204246.30623: no more pending results, returning what we have 26264 1727204246.30628: in VariableManager get_vars() 26264 1727204246.30668: Calling all_inventory to load vars for managed-node3 26264 1727204246.30671: Calling groups_inventory to load vars for managed-node3 26264 1727204246.30675: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.30689: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.30692: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.30695: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.30892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.31118: done 
with get_vars() 26264 1727204246.31124: variable 'ansible_search_path' from source: unknown 26264 1727204246.31125: variable 'ansible_search_path' from source: unknown 26264 1727204246.31168: we have included files to process 26264 1727204246.31169: generating all_blocks data 26264 1727204246.31171: done generating all_blocks data 26264 1727204246.31172: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26264 1727204246.31173: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26264 1727204246.31176: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26264 1727204246.31533: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000001d3 26264 1727204246.31537: WORKER PROCESS EXITING 26264 1727204246.31636: done processing included file 26264 1727204246.31638: iterating over new_blocks loaded from include file 26264 1727204246.31640: in VariableManager get_vars() 26264 1727204246.31653: done with get_vars() 26264 1727204246.31655: filtering new block on tags 26264 1727204246.31670: done filtering new block on tags 26264 1727204246.31674: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 26264 1727204246.31679: extending task lists for all hosts with included blocks 26264 1727204246.31781: done extending task lists 26264 1727204246.31783: done processing included files 26264 1727204246.31784: results queue empty 26264 1727204246.31785: checking for any_errors_fatal 26264 1727204246.31788: done checking for any_errors_fatal 26264 1727204246.31788: checking for max_fail_percentage 26264 1727204246.31789: done checking for max_fail_percentage 
26264 1727204246.31790: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.31791: done checking to see if all hosts have failed 26264 1727204246.31792: getting the remaining hosts for this loop 26264 1727204246.31793: done getting the remaining hosts for this loop 26264 1727204246.31796: getting the next task for host managed-node3 26264 1727204246.31800: done getting next task for host managed-node3 26264 1727204246.31802: ^ task is: TASK: Get stat for interface {{ interface }} 26264 1727204246.31805: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.31807: getting variables 26264 1727204246.31808: in VariableManager get_vars() 26264 1727204246.31816: Calling all_inventory to load vars for managed-node3 26264 1727204246.31818: Calling groups_inventory to load vars for managed-node3 26264 1727204246.31820: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.31824: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.31827: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.31829: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.31972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.32419: done with get_vars() 26264 1727204246.32429: done getting variables 26264 1727204246.32585: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:57:26 -0400 (0:00:00.036) 0:00:10.175 ***** 26264 1727204246.32614: entering _queue_task() for managed-node3/stat 26264 1727204246.32841: worker is 1 (out of 1 available) 26264 1727204246.32857: exiting _queue_task() for managed-node3/stat 26264 1727204246.32870: done queuing things up, now waiting for results queue to drain 26264 1727204246.32872: waiting for pending results... 
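The stat task queued here ultimately checks that `/sys/class/net/lsr27` exists on the managed node (the result further below shows it is a symlink into `/sys/devices/virtual/net/lsr27`). The presence check itself amounts to a single filesystem lookup; here is a minimal sketch of the same check outside Ansible, where the function name `device_present` and the `sysfs_root` parameter are illustrative inventions, not part of the test role:

```python
import os

def device_present(interface: str, sysfs_root: str = "/sys/class/net") -> bool:
    """Mirror what the stat task verifies: the kernel exposes the
    interface as an entry (a symlink on a real system) under
    /sys/class/net. lexists() is used so a symlink counts even if
    its target were unreadable."""
    return os.path.lexists(os.path.join(sysfs_root, interface))
```

On the node in this run, `device_present("lsr27")` would return True, matching the `"exists": true` in the module result that follows.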
26264 1727204246.33334: running TaskExecutor() for managed-node3/TASK: Get stat for interface lsr27 26264 1727204246.33556: in run() - task 0affcd87-79f5-5ff5-08b0-00000000021e 26264 1727204246.33574: variable 'ansible_search_path' from source: unknown 26264 1727204246.33581: variable 'ansible_search_path' from source: unknown 26264 1727204246.33614: calling self._execute() 26264 1727204246.33723: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.33734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.33747: variable 'omit' from source: magic vars 26264 1727204246.34254: variable 'ansible_distribution_major_version' from source: facts 26264 1727204246.34274: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204246.34284: variable 'omit' from source: magic vars 26264 1727204246.34337: variable 'omit' from source: magic vars 26264 1727204246.34438: variable 'interface' from source: set_fact 26264 1727204246.34461: variable 'omit' from source: magic vars 26264 1727204246.34504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204246.34544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204246.34573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204246.34594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204246.34609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204246.34646: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204246.34660: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.34671: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.34880: Set connection var ansible_pipelining to False 26264 1727204246.34889: Set connection var ansible_connection to ssh 26264 1727204246.34896: Set connection var ansible_shell_type to sh 26264 1727204246.34906: Set connection var ansible_shell_executable to /bin/sh 26264 1727204246.34919: Set connection var ansible_timeout to 10 26264 1727204246.34932: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204246.34965: variable 'ansible_shell_executable' from source: unknown 26264 1727204246.34974: variable 'ansible_connection' from source: unknown 26264 1727204246.34979: variable 'ansible_module_compression' from source: unknown 26264 1727204246.34984: variable 'ansible_shell_type' from source: unknown 26264 1727204246.34990: variable 'ansible_shell_executable' from source: unknown 26264 1727204246.34995: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.35001: variable 'ansible_pipelining' from source: unknown 26264 1727204246.35006: variable 'ansible_timeout' from source: unknown 26264 1727204246.35011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.35268: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204246.35289: variable 'omit' from source: magic vars 26264 1727204246.35300: starting attempt loop 26264 1727204246.35306: running the handler 26264 1727204246.35322: _low_level_execute_command(): starting 26264 1727204246.35334: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204246.36086: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204246.36102: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 26264 1727204246.36117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.36136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.36189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.36202: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204246.36216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.36235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204246.36246: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204246.36265: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204246.36278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.36293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.36310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.36324: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.36336: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204246.36354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.36510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204246.36535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204246.36557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204246.36642: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 26264 1727204246.38258: stdout chunk (state=3): >>>/root <<< 26264 1727204246.38456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204246.38460: stdout chunk (state=3): >>><<< 26264 1727204246.38462: stderr chunk (state=3): >>><<< 26264 1727204246.38580: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204246.38583: _low_level_execute_command(): starting 26264 1727204246.38587: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269 `" && echo ansible-tmp-1727204246.3848817-27188-129813053367269="` echo /root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269 `" ) && sleep 0' 26264 
1727204246.39507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.39511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.39547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204246.39553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.39556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.39629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204246.39632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204246.39637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204246.39691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204246.41530: stdout chunk (state=3): >>>ansible-tmp-1727204246.3848817-27188-129813053367269=/root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269 <<< 26264 1727204246.41644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204246.41709: stderr chunk (state=3): >>><<< 26264 1727204246.41712: stdout chunk (state=3): >>><<< 26264 1727204246.41872: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204246.3848817-27188-129813053367269=/root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204246.41876: variable 'ansible_module_compression' from source: unknown 26264 1727204246.41878: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 26264 1727204246.41981: variable 'ansible_facts' from source: unknown 26264 1727204246.41994: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269/AnsiballZ_stat.py 26264 1727204246.42147: Sending initial data 26264 1727204246.42153: Sent initial data (153 bytes) 26264 1727204246.43156: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204246.43174: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.43191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.43216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.43267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.43285: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204246.43300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.43319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204246.43332: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204246.43343: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204246.43358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.43376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.43396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.43411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.43423: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204246.43437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.43521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204246.43541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204246.43557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204246.43629: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204246.45314: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204246.45351: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204246.45404: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp0ljpo4df /root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269/AnsiballZ_stat.py <<< 26264 1727204246.45440: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204246.46817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204246.46954: stderr chunk (state=3): >>><<< 26264 1727204246.46958: stdout chunk (state=3): >>><<< 26264 1727204246.46960: done transferring module to remote 26264 1727204246.46963: _low_level_execute_command(): starting 26264 1727204246.46973: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269/ /root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269/AnsiballZ_stat.py && sleep 0' 26264 1727204246.47732: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.47736: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.47769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.47772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.47783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.47853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204246.47857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204246.47928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204246.49612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204246.49702: stderr chunk (state=3): >>><<< 26264 1727204246.49706: stdout chunk (state=3): >>><<< 26264 1727204246.49808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204246.49812: _low_level_execute_command(): starting 26264 1727204246.49814: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269/AnsiballZ_stat.py && sleep 0' 26264 1727204246.51330: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204246.51344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.51360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.51457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.51502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.51513: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204246.51533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.51601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204246.51641: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204246.51655: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204246.51667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.51682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.51695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.51705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.51713: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204246.51723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.51840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204246.51911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204246.51925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204246.52152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204246.64976: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30210, "dev": 21, "nlink": 1, "atime": 1727204244.6472993, "mtime": 1727204244.6472993, "ctime": 1727204244.6472993, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": 
"../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 26264 1727204246.66043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204246.66047: stdout chunk (state=3): >>><<< 26264 1727204246.66052: stderr chunk (state=3): >>><<< 26264 1727204246.66229: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30210, "dev": 21, "nlink": 1, "atime": 1727204244.6472993, "mtime": 1727204244.6472993, "ctime": 1727204244.6472993, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204246.66238: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204246.66241: _low_level_execute_command(): starting 26264 1727204246.66244: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204246.3848817-27188-129813053367269/ > /dev/null 2>&1 && sleep 0' 26264 1727204246.67658: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204246.67781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 
1727204246.67798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.67889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.67935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.67951: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204246.67969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.67992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204246.68003: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204246.68013: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204246.68024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.68038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.68056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.68069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.68084: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204246.68211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.68282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204246.68310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204246.68353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204246.68431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 
1727204246.70308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204246.70312: stdout chunk (state=3): >>><<< 26264 1727204246.70314: stderr chunk (state=3): >>><<< 26264 1727204246.70370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204246.70374: handler run complete 26264 1727204246.70572: attempt loop complete, returning result 26264 1727204246.70575: _execute() done 26264 1727204246.70578: dumping result to json 26264 1727204246.70580: done dumping result, returning 26264 1727204246.70582: done running TaskExecutor() for managed-node3/TASK: Get stat for interface lsr27 [0affcd87-79f5-5ff5-08b0-00000000021e] 26264 1727204246.70584: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000021e 26264 1727204246.70663: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000021e 
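The task that just completed ("Get stat for interface lsr27") can be reconstructed from the module_args echoed in the result above. A sketch of what that task likely looks like in tasks/assert_device_present.yml — the task name template and registered variable appear in the log, but the exact YAML layout is an assumption:

```yaml
# Hedged reconstruction from the log's module_args; the real task templates
# the device name as '{{ interface }}' (here set to lsr27 via set_fact).
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"  # symlink exists only when the device is present
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```

Note that the stat result reports islnk: true with lnk_target ../../devices/virtual/net/lsr27, confirming the path is the usual /sys/class/net symlink into the virtual-device tree.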
26264 1727204246.70672: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204244.6472993, "block_size": 4096, "blocks": 0, "ctime": 1727204244.6472993, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30210, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1727204244.6472993, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 26264 1727204246.70767: no more pending results, returning what we have 26264 1727204246.70772: results queue empty 26264 1727204246.70773: checking for any_errors_fatal 26264 1727204246.70774: done checking for any_errors_fatal 26264 1727204246.70775: checking for max_fail_percentage 26264 1727204246.70777: done checking for max_fail_percentage 26264 1727204246.70778: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.70779: done checking to see if all hosts have failed 26264 1727204246.70780: getting the remaining hosts for this loop 26264 1727204246.70782: done getting the remaining hosts for this loop 26264 1727204246.70786: getting the next task for host managed-node3 26264 1727204246.70794: done getting next task for host managed-node3 26264 1727204246.70797: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 26264 1727204246.70800: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204246.70805: getting variables 26264 1727204246.70806: in VariableManager get_vars() 26264 1727204246.70837: Calling all_inventory to load vars for managed-node3 26264 1727204246.70845: Calling groups_inventory to load vars for managed-node3 26264 1727204246.70852: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.70865: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.70869: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.70872: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.71273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.71581: done with get_vars() 26264 1727204246.71592: done getting variables 26264 1727204246.71895: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 26264 1727204246.72022: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:57:26 -0400 (0:00:00.394) 0:00:10.569 ***** 26264 1727204246.72059: entering _queue_task() for managed-node3/assert 26264 1727204246.72061: Creating lock 
for assert 26264 1727204246.72675: worker is 1 (out of 1 available) 26264 1727204246.72686: exiting _queue_task() for managed-node3/assert 26264 1727204246.72699: done queuing things up, now waiting for results queue to drain 26264 1727204246.72701: waiting for pending results... 26264 1727204246.73595: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'lsr27' 26264 1727204246.73839: in run() - task 0affcd87-79f5-5ff5-08b0-0000000001d4 26264 1727204246.73976: variable 'ansible_search_path' from source: unknown 26264 1727204246.73985: variable 'ansible_search_path' from source: unknown 26264 1727204246.74025: calling self._execute() 26264 1727204246.74123: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.74179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.74300: variable 'omit' from source: magic vars 26264 1727204246.75071: variable 'ansible_distribution_major_version' from source: facts 26264 1727204246.75090: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204246.75100: variable 'omit' from source: magic vars 26264 1727204246.75143: variable 'omit' from source: magic vars 26264 1727204246.75359: variable 'interface' from source: set_fact 26264 1727204246.75499: variable 'omit' from source: magic vars 26264 1727204246.75546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204246.75596: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204246.75623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204246.75703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204246.75721: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204246.75758: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204246.75813: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.75821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.76133: Set connection var ansible_pipelining to False 26264 1727204246.76142: Set connection var ansible_connection to ssh 26264 1727204246.76152: Set connection var ansible_shell_type to sh 26264 1727204246.76166: Set connection var ansible_shell_executable to /bin/sh 26264 1727204246.76181: Set connection var ansible_timeout to 10 26264 1727204246.76192: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204246.76222: variable 'ansible_shell_executable' from source: unknown 26264 1727204246.76229: variable 'ansible_connection' from source: unknown 26264 1727204246.76239: variable 'ansible_module_compression' from source: unknown 26264 1727204246.76245: variable 'ansible_shell_type' from source: unknown 26264 1727204246.76254: variable 'ansible_shell_executable' from source: unknown 26264 1727204246.76260: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.76269: variable 'ansible_pipelining' from source: unknown 26264 1727204246.76355: variable 'ansible_timeout' from source: unknown 26264 1727204246.76367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.76628: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204246.76646: variable 'omit' from source: magic vars 26264 1727204246.76662: starting 
attempt loop 26264 1727204246.76786: running the handler 26264 1727204246.76937: variable 'interface_stat' from source: set_fact 26264 1727204246.77019: Evaluated conditional (interface_stat.stat.exists): True 26264 1727204246.77078: handler run complete 26264 1727204246.77099: attempt loop complete, returning result 26264 1727204246.77109: _execute() done 26264 1727204246.77115: dumping result to json 26264 1727204246.77220: done dumping result, returning 26264 1727204246.77233: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'lsr27' [0affcd87-79f5-5ff5-08b0-0000000001d4] 26264 1727204246.77243: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000001d4 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 26264 1727204246.77400: no more pending results, returning what we have 26264 1727204246.77404: results queue empty 26264 1727204246.77405: checking for any_errors_fatal 26264 1727204246.77412: done checking for any_errors_fatal 26264 1727204246.77413: checking for max_fail_percentage 26264 1727204246.77415: done checking for max_fail_percentage 26264 1727204246.77415: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.77417: done checking to see if all hosts have failed 26264 1727204246.77418: getting the remaining hosts for this loop 26264 1727204246.77420: done getting the remaining hosts for this loop 26264 1727204246.77424: getting the next task for host managed-node3 26264 1727204246.77432: done getting next task for host managed-node3 26264 1727204246.77434: ^ task is: TASK: meta (flush_handlers) 26264 1727204246.77437: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.77442: getting variables 26264 1727204246.77444: in VariableManager get_vars() 26264 1727204246.77480: Calling all_inventory to load vars for managed-node3 26264 1727204246.77484: Calling groups_inventory to load vars for managed-node3 26264 1727204246.77488: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.77500: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.77503: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.77506: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.77700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.77894: done with get_vars() 26264 1727204246.77906: done getting variables 26264 1727204246.77982: in VariableManager get_vars() 26264 1727204246.77991: Calling all_inventory to load vars for managed-node3 26264 1727204246.77993: Calling groups_inventory to load vars for managed-node3 26264 1727204246.77995: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.78000: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.78002: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.78005: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.78653: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000001d4 26264 1727204246.78657: WORKER PROCESS EXITING 26264 1727204246.78680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.79076: done with get_vars() 26264 1727204246.79091: done queuing things up, now waiting for results queue to drain 26264 1727204246.79093: results queue empty 26264 1727204246.79093: checking for any_errors_fatal 26264 1727204246.79096: done checking for any_errors_fatal 26264 1727204246.79097: checking for max_fail_percentage 26264 
1727204246.79098: done checking for max_fail_percentage 26264 1727204246.79099: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.79100: done checking to see if all hosts have failed 26264 1727204246.79106: getting the remaining hosts for this loop 26264 1727204246.79107: done getting the remaining hosts for this loop 26264 1727204246.79110: getting the next task for host managed-node3 26264 1727204246.79113: done getting next task for host managed-node3 26264 1727204246.79115: ^ task is: TASK: meta (flush_handlers) 26264 1727204246.79116: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204246.79119: getting variables 26264 1727204246.79120: in VariableManager get_vars() 26264 1727204246.79127: Calling all_inventory to load vars for managed-node3 26264 1727204246.79130: Calling groups_inventory to load vars for managed-node3 26264 1727204246.79132: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.79137: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.79139: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.79142: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.79290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.79886: done with get_vars() 26264 1727204246.79894: done getting variables 26264 1727204246.79939: in VariableManager get_vars() 26264 1727204246.79950: Calling all_inventory to load vars for managed-node3 26264 1727204246.79952: Calling groups_inventory to load vars for managed-node3 26264 1727204246.79955: Calling all_plugins_inventory to load vars for managed-node3 26264 
1727204246.79959: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.79961: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.79965: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.80353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.81137: done with get_vars() 26264 1727204246.81151: done queuing things up, now waiting for results queue to drain 26264 1727204246.81154: results queue empty 26264 1727204246.81154: checking for any_errors_fatal 26264 1727204246.81156: done checking for any_errors_fatal 26264 1727204246.81156: checking for max_fail_percentage 26264 1727204246.81157: done checking for max_fail_percentage 26264 1727204246.81158: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.81159: done checking to see if all hosts have failed 26264 1727204246.81159: getting the remaining hosts for this loop 26264 1727204246.81160: done getting the remaining hosts for this loop 26264 1727204246.81163: getting the next task for host managed-node3 26264 1727204246.81370: done getting next task for host managed-node3 26264 1727204246.81372: ^ task is: None 26264 1727204246.81373: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.81375: done queuing things up, now waiting for results queue to drain 26264 1727204246.81376: results queue empty 26264 1727204246.81376: checking for any_errors_fatal 26264 1727204246.81377: done checking for any_errors_fatal 26264 1727204246.81378: checking for max_fail_percentage 26264 1727204246.81379: done checking for max_fail_percentage 26264 1727204246.81380: checking to see if all hosts have failed and the running result is not ok 26264 1727204246.81380: done checking to see if all hosts have failed 26264 1727204246.81382: getting the next task for host managed-node3 26264 1727204246.81384: done getting next task for host managed-node3 26264 1727204246.81385: ^ task is: None 26264 1727204246.81386: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.81428: in VariableManager get_vars() 26264 1727204246.81452: done with get_vars() 26264 1727204246.81458: in VariableManager get_vars() 26264 1727204246.81472: done with get_vars() 26264 1727204246.81476: variable 'omit' from source: magic vars 26264 1727204246.81505: in VariableManager get_vars() 26264 1727204246.81516: done with get_vars() 26264 1727204246.81536: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 26264 1727204246.82893: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 26264 1727204246.83580: getting the remaining hosts for this loop 26264 1727204246.83581: done getting the remaining hosts for this loop 26264 1727204246.83584: getting the next task for host managed-node3 26264 1727204246.83587: done getting next task for host managed-node3 26264 1727204246.83589: ^ task is: TASK: Gathering Facts 26264 1727204246.83590: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204246.83592: getting variables 26264 1727204246.83593: in VariableManager get_vars() 26264 1727204246.83607: Calling all_inventory to load vars for managed-node3 26264 1727204246.83609: Calling groups_inventory to load vars for managed-node3 26264 1727204246.83612: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204246.83617: Calling all_plugins_play to load vars for managed-node3 26264 1727204246.83620: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204246.83623: Calling groups_plugins_play to load vars for managed-node3 26264 1727204246.83773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204246.84004: done with get_vars() 26264 1727204246.84013: done getting variables 26264 1727204246.84058: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Tuesday 24 September 2024 14:57:26 -0400 (0:00:00.120) 0:00:10.690 ***** 26264 1727204246.84084: entering _queue_task() for managed-node3/gather_facts 26264 1727204246.84577: worker is 1 (out of 1 available) 26264 1727204246.84589: exiting _queue_task() for managed-node3/gather_facts 26264 1727204246.84600: done queuing things up, now waiting for results queue to drain 26264 1727204246.84602: waiting for pending results... 
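The "All assertions passed" result earlier in this section comes from the companion assert task at assert_device_present.yml:5. The conditional it evaluated — interface_stat.stat.exists — is shown verbatim in the log; the surrounding YAML and the failure message below are assumptions:

```yaml
# Sketch of the assert task; only the task name and the evaluated
# conditional are taken from the log, the fail_msg wording is assumed.
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
    fail_msg: "Interface {{ interface }} is not present"
```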
26264 1727204246.85376: running TaskExecutor() for managed-node3/TASK: Gathering Facts 26264 1727204246.85478: in run() - task 0affcd87-79f5-5ff5-08b0-000000000237 26264 1727204246.85590: variable 'ansible_search_path' from source: unknown 26264 1727204246.85631: calling self._execute() 26264 1727204246.85835: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.85847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.85868: variable 'omit' from source: magic vars 26264 1727204246.86686: variable 'ansible_distribution_major_version' from source: facts 26264 1727204246.86736: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204246.86750: variable 'omit' from source: magic vars 26264 1727204246.86781: variable 'omit' from source: magic vars 26264 1727204246.86945: variable 'omit' from source: magic vars 26264 1727204246.86995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204246.87035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204246.87181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204246.87204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204246.87221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204246.87257: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204246.87376: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.87384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.87601: Set connection var ansible_pipelining to False 26264 1727204246.87609: Set 
connection var ansible_connection to ssh 26264 1727204246.87616: Set connection var ansible_shell_type to sh 26264 1727204246.87627: Set connection var ansible_shell_executable to /bin/sh 26264 1727204246.87639: Set connection var ansible_timeout to 10 26264 1727204246.87653: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204246.87683: variable 'ansible_shell_executable' from source: unknown 26264 1727204246.87691: variable 'ansible_connection' from source: unknown 26264 1727204246.87700: variable 'ansible_module_compression' from source: unknown 26264 1727204246.87707: variable 'ansible_shell_type' from source: unknown 26264 1727204246.87713: variable 'ansible_shell_executable' from source: unknown 26264 1727204246.87718: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204246.87725: variable 'ansible_pipelining' from source: unknown 26264 1727204246.87731: variable 'ansible_timeout' from source: unknown 26264 1727204246.87738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204246.88160: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204246.88178: variable 'omit' from source: magic vars 26264 1727204246.88188: starting attempt loop 26264 1727204246.88193: running the handler 26264 1727204246.88259: variable 'ansible_facts' from source: unknown 26264 1727204246.88290: _low_level_execute_command(): starting 26264 1727204246.88361: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204246.90218: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204246.90313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
26264 1727204246.90329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.90350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.90396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.90410: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204246.90424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.90444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204246.90459: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204246.90473: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204246.90484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.90497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.90511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.90526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.90536: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204246.90553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.90696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204246.90759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204246.90780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204246.90861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 
1727204246.92478: stdout chunk (state=3): >>>/root <<< 26264 1727204246.92586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204246.92699: stderr chunk (state=3): >>><<< 26264 1727204246.92702: stdout chunk (state=3): >>><<< 26264 1727204246.92821: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204246.92824: _low_level_execute_command(): starting 26264 1727204246.92827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117 `" && echo ansible-tmp-1727204246.9272864-27247-53586494408117="` echo /root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117 `" ) && sleep 0' 26264 1727204246.95231: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 26264 1727204246.95262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.95282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.95301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.95345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.95493: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204246.95604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.95622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204246.95634: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204246.95644: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204246.95659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204246.95676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204246.95694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204246.95711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204246.95721: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204246.95734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204246.95818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204246.95941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204246.95958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 26264 1727204246.96097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204246.97886: stdout chunk (state=3): >>>ansible-tmp-1727204246.9272864-27247-53586494408117=/root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117 <<< 26264 1727204246.98369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204246.98373: stdout chunk (state=3): >>><<< 26264 1727204246.98376: stderr chunk (state=3): >>><<< 26264 1727204246.98378: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204246.9272864-27247-53586494408117=/root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204246.98381: variable 'ansible_module_compression' from source: unknown 26264 1727204246.98383: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204246.98385: variable 'ansible_facts' from source: unknown 26264 1727204246.98525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117/AnsiballZ_setup.py 26264 1727204247.00495: Sending initial data 26264 1727204247.00506: Sent initial data (153 bytes) 26264 1727204247.03064: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204247.03069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204247.03099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204247.03184: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204247.03202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204247.03219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204247.03231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204247.03241: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204247.03252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204247.03267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204247.03283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204247.03308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204247.03320: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204247.03335: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204247.03580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204247.03601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204247.03616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204247.03742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204247.05383: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204247.05422: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204247.05461: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp9dqq98rm /root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117/AnsiballZ_setup.py <<< 26264 1727204247.05499: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204247.08599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204247.08723: stderr chunk (state=3): >>><<< 26264 1727204247.08726: stdout chunk (state=3): >>><<< 26264 1727204247.08729: done transferring module to remote 26264 1727204247.08731: _low_level_execute_command(): starting 26264 1727204247.08733: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117/ /root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117/AnsiballZ_setup.py && sleep 0' 26264 1727204247.10323: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204247.10355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204247.10383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204247.10405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204247.10463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204247.10486: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204247.10509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204247.10537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204247.10554: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204247.10569: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204247.10590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204247.10607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204247.10633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204247.10646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204247.10671: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204247.10690: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204247.10779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204247.10893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204247.10914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204247.11002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204247.12789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204247.12792: stdout chunk (state=3): >>><<< 26264 1727204247.12794: stderr chunk (state=3): >>><<< 26264 1727204247.12871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204247.12875: _low_level_execute_command(): starting 26264 1727204247.12877: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117/AnsiballZ_setup.py && sleep 0' 26264 1727204247.14469: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204247.14476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204247.14529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204247.14551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204247.14555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204247.14628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204247.14631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204247.14637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204247.14720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204247.66119: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, 
"ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "27", "epoch": "1727204247", "epoch_int": "1727204247", "date": "2024-09-24", "time": "14:57:27", "iso8601_micro": "2024-09-24T18:57:27.370995Z", "iso8601": "2024-09-24T18:57:27Z", "iso8601_basic": "20240924T145727370995", "iso8601_basic_short": "20240924T145727", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_is_chroot": false, "ansible_system": 
"Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_loadavg": {"1m": 0.51, "5m": 0.38, "15m": 0.2}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "peerlsr27", "lo", "lsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:14:6b:cc:14:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c014:6bff:fecc:144a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": 
"on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off 
[fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": 
"off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", 
"netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "2a:10:ff:43:10:2c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b318:e6ab:cc76:c51b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", 
"macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::c014:6bff:fecc:144a", "fe80::8ff:f5ff:fed7:be93", "fe80::b318:e6ab:cc76:c51b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93", "fe80::b318:e6ab:cc76:c51b", "fe80::c014:6bff:fecc:144a"]}, "ansible_proce<<< 26264 1727204247.66131: stdout chunk (state=3): >>>ssor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2750, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 782, "free": 2750}, "nocache": {"free": 3209, "used": 323}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 593, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264280002560, "block_size": 4096, "block_total": 65519355, "block_available": 64521485, "block_used": 997870, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204247.67739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204247.67822: stderr chunk (state=3): >>><<< 26264 1727204247.67826: stdout chunk (state=3): >>><<< 26264 1727204247.67975: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "27", "epoch": "1727204247", "epoch_int": "1727204247", "date": "2024-09-24", "time": "14:57:27", "iso8601_micro": 
"2024-09-24T18:57:27.370995Z", "iso8601": "2024-09-24T18:57:27Z", "iso8601_basic": "20240924T145727370995", "iso8601_basic_short": "20240924T145727", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", 
"SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_loadavg": {"1m": 0.51, "5m": 0.38, "15m": 0.2}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "peerlsr27", "lo", "lsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:14:6b:cc:14:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c014:6bff:fecc:144a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "2a:10:ff:43:10:2c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b318:e6ab:cc76:c51b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::c014:6bff:fecc:144a", "fe80::8ff:f5ff:fed7:be93", "fe80::b318:e6ab:cc76:c51b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93", "fe80::b318:e6ab:cc76:c51b", "fe80::c014:6bff:fecc:144a"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2750, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 782, "free": 2750}, "nocache": {"free": 3209, "used": 323}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 593, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264280002560, "block_size": 4096, "block_total": 65519355, "block_available": 64521485, "block_used": 997870, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": 
"/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204247.68472: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204247.68497: _low_level_execute_command(): starting 26264 1727204247.68506: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204246.9272864-27247-53586494408117/ > /dev/null 2>&1 && sleep 0' 26264 1727204247.70971: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204247.71199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204247.71218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204247.71239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204247.71296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204247.71414: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204247.71430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204247.71452: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204247.71469: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.15.87 is address <<< 26264 1727204247.71482: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204247.71522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204247.71537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204247.71559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204247.71575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204247.71588: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204247.71602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204247.71795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204247.71860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204247.71880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204247.71966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204247.73833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204247.73838: stdout chunk (state=3): >>><<< 26264 1727204247.73840: stderr chunk (state=3): >>><<< 26264 1727204247.74072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204247.74075: handler run complete 26264 1727204247.74078: variable 'ansible_facts' from source: unknown 26264 1727204247.74160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204247.74588: variable 'ansible_facts' from source: unknown 26264 1727204247.74933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204247.75267: attempt loop complete, returning result 26264 1727204247.75274: _execute() done 26264 1727204247.75277: dumping result to json 26264 1727204247.75322: done dumping result, returning 26264 1727204247.75329: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-5ff5-08b0-000000000237] 26264 1727204247.75335: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000237 26264 1727204247.75796: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000237 26264 1727204247.75800: WORKER PROCESS EXITING ok: [managed-node3] 26264 1727204247.76129: no more pending results, returning what we have 26264 1727204247.76134: results queue empty 26264 1727204247.76134: checking for any_errors_fatal 26264 1727204247.76136: done checking for any_errors_fatal 26264 1727204247.76136: checking for max_fail_percentage 26264 1727204247.76138: done checking for 
max_fail_percentage 26264 1727204247.76139: checking to see if all hosts have failed and the running result is not ok 26264 1727204247.76141: done checking to see if all hosts have failed 26264 1727204247.76142: getting the remaining hosts for this loop 26264 1727204247.76144: done getting the remaining hosts for this loop 26264 1727204247.76148: getting the next task for host managed-node3 26264 1727204247.76153: done getting next task for host managed-node3 26264 1727204247.76155: ^ task is: TASK: meta (flush_handlers) 26264 1727204247.76158: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204247.76161: getting variables 26264 1727204247.76163: in VariableManager get_vars() 26264 1727204247.76201: Calling all_inventory to load vars for managed-node3 26264 1727204247.76205: Calling groups_inventory to load vars for managed-node3 26264 1727204247.76208: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204247.76219: Calling all_plugins_play to load vars for managed-node3 26264 1727204247.76221: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204247.76224: Calling groups_plugins_play to load vars for managed-node3 26264 1727204247.76432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204247.77072: done with get_vars() 26264 1727204247.77084: done getting variables 26264 1727204247.77152: in VariableManager get_vars() 26264 1727204247.77223: Calling all_inventory to load vars for managed-node3 26264 1727204247.77227: Calling groups_inventory to load vars for managed-node3 26264 1727204247.77229: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204247.77235: Calling 
all_plugins_play to load vars for managed-node3 26264 1727204247.77237: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204247.77245: Calling groups_plugins_play to load vars for managed-node3 26264 1727204247.77454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204247.77732: done with get_vars() 26264 1727204247.77748: done queuing things up, now waiting for results queue to drain 26264 1727204247.77751: results queue empty 26264 1727204247.77751: checking for any_errors_fatal 26264 1727204247.77755: done checking for any_errors_fatal 26264 1727204247.77755: checking for max_fail_percentage 26264 1727204247.77756: done checking for max_fail_percentage 26264 1727204247.77757: checking to see if all hosts have failed and the running result is not ok 26264 1727204247.77758: done checking to see if all hosts have failed 26264 1727204247.77758: getting the remaining hosts for this loop 26264 1727204247.77759: done getting the remaining hosts for this loop 26264 1727204247.77762: getting the next task for host managed-node3 26264 1727204247.77767: done getting next task for host managed-node3 26264 1727204247.77770: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 26264 1727204247.77772: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204247.77782: getting variables 26264 1727204247.77782: in VariableManager get_vars() 26264 1727204247.77799: Calling all_inventory to load vars for managed-node3 26264 1727204247.77802: Calling groups_inventory to load vars for managed-node3 26264 1727204247.77804: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204247.77809: Calling all_plugins_play to load vars for managed-node3 26264 1727204247.77811: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204247.77814: Calling groups_plugins_play to load vars for managed-node3 26264 1727204247.78129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204247.78516: done with get_vars() 26264 1727204247.78524: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.945) 0:00:11.635 ***** 26264 1727204247.78604: entering _queue_task() for managed-node3/include_tasks 26264 1727204247.78940: worker is 1 (out of 1 available) 26264 1727204247.78952: exiting _queue_task() for managed-node3/include_tasks 26264 1727204247.78967: done queuing things up, now waiting for results queue to drain 26264 1727204247.78969: waiting for pending results... 
26264 1727204247.79325: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
26264 1727204247.79500: in run() - task 0affcd87-79f5-5ff5-08b0-000000000019
26264 1727204247.79538: variable 'ansible_search_path' from source: unknown
26264 1727204247.79579: variable 'ansible_search_path' from source: unknown
26264 1727204247.79702: calling self._execute()
26264 1727204247.79817: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204247.79829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204247.79843: variable 'omit' from source: magic vars
26264 1727204247.80323: variable 'ansible_distribution_major_version' from source: facts
26264 1727204247.80341: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204247.80356: _execute() done
26264 1727204247.80366: dumping result to json
26264 1727204247.80374: done dumping result, returning
26264 1727204247.80385: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-5ff5-08b0-000000000019]
26264 1727204247.80394: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000019
26264 1727204247.80503: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000019
26264 1727204247.80512: WORKER PROCESS EXITING
26264 1727204247.80559: no more pending results, returning what we have
26264 1727204247.80568: in VariableManager get_vars()
26264 1727204247.80614: Calling all_inventory to load vars for managed-node3
26264 1727204247.80618: Calling groups_inventory to load vars for managed-node3
26264 1727204247.80620: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204247.80633: Calling all_plugins_play to load vars for managed-node3
26264 1727204247.80635: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204247.80638: Calling groups_plugins_play to load vars for managed-node3
26264 1727204247.80862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204247.81398: done with get_vars()
26264 1727204247.81407: variable 'ansible_search_path' from source: unknown
26264 1727204247.81408: variable 'ansible_search_path' from source: unknown
26264 1727204247.81437: we have included files to process
26264 1727204247.81438: generating all_blocks data
26264 1727204247.81439: done generating all_blocks data
26264 1727204247.81440: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
26264 1727204247.81441: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
26264 1727204247.81444: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
26264 1727204247.82455: done processing included file
26264 1727204247.82457: iterating over new_blocks loaded from include file
26264 1727204247.82459: in VariableManager get_vars()
26264 1727204247.82483: done with get_vars()
26264 1727204247.82485: filtering new block on tags
26264 1727204247.82502: done filtering new block on tags
26264 1727204247.82504: in VariableManager get_vars()
26264 1727204247.82524: done with get_vars()
26264 1727204247.82525: filtering new block on tags
26264 1727204247.82543: done filtering new block on tags
26264 1727204247.82545: in VariableManager get_vars()
26264 1727204247.82567: done with get_vars()
26264 1727204247.82569: filtering new block on tags
26264 1727204247.82589: done filtering new block on tags
26264 1727204247.82591: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3
26264 1727204247.82596: extending task lists for all hosts with included blocks
26264 1727204247.83017: done extending task lists
26264 1727204247.83019: done processing included files
26264 1727204247.83021: results queue empty
26264 1727204247.83021: checking for any_errors_fatal
26264 1727204247.83023: done checking for any_errors_fatal
26264 1727204247.83024: checking for max_fail_percentage
26264 1727204247.83025: done checking for max_fail_percentage
26264 1727204247.83026: checking to see if all hosts have failed and the running result is not ok
26264 1727204247.83027: done checking to see if all hosts have failed
26264 1727204247.83028: getting the remaining hosts for this loop
26264 1727204247.83029: done getting the remaining hosts for this loop
26264 1727204247.83032: getting the next task for host managed-node3
26264 1727204247.83036: done getting next task for host managed-node3
26264 1727204247.83039: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
26264 1727204247.83041: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204247.83050: getting variables
26264 1727204247.83051: in VariableManager get_vars()
26264 1727204247.83068: Calling all_inventory to load vars for managed-node3
26264 1727204247.83070: Calling groups_inventory to load vars for managed-node3
26264 1727204247.83072: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204247.83078: Calling all_plugins_play to load vars for managed-node3
26264 1727204247.83080: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204247.83083: Calling groups_plugins_play to load vars for managed-node3
26264 1727204247.83293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204247.83670: done with get_vars()
26264 1727204247.83685: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024  14:57:27 -0400 (0:00:00.051)       0:00:11.686 *****
26264 1727204247.83770: entering _queue_task() for managed-node3/setup
26264 1727204247.84360: worker is 1 (out of 1 available)
26264 1727204247.84419: exiting _queue_task() for managed-node3/setup
26264 1727204247.84432: done queuing things up, now waiting for results queue to drain
26264 1727204247.84433: waiting for pending results...
26264 1727204247.85834: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
26264 1727204247.86219: in run() - task 0affcd87-79f5-5ff5-08b0-000000000279
26264 1727204247.86241: variable 'ansible_search_path' from source: unknown
26264 1727204247.86332: variable 'ansible_search_path' from source: unknown
26264 1727204247.86378: calling self._execute()
26264 1727204247.86532: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204247.86668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204247.86684: variable 'omit' from source: magic vars
26264 1727204247.87404: variable 'ansible_distribution_major_version' from source: facts
26264 1727204247.87541: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204247.88974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
26264 1727204247.94559: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
26264 1727204247.94876: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
26264 1727204247.94931: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
26264 1727204247.94984: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
26264 1727204247.95017: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
26264 1727204247.95242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
26264 1727204247.95308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
26264 1727204247.95366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
26264 1727204247.95437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
26264 1727204247.96313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
26264 1727204247.96377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
26264 1727204247.96440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
26264 1727204247.96481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
26264 1727204247.96565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
26264 1727204247.96587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
26264 1727204247.96790: variable '__network_required_facts' from source: role '' defaults
26264 1727204247.96806: variable 'ansible_facts' from source: unknown
26264 1727204247.96926: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
26264 1727204247.96935: when evaluation is False, skipping this task
26264 1727204247.96942: _execute() done
26264 1727204247.96952: dumping result to json
26264 1727204247.96963: done dumping result, returning
26264 1727204247.96984: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-5ff5-08b0-000000000279]
26264 1727204247.96995: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000279
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
26264 1727204247.97151: no more pending results, returning what we have
26264 1727204247.97156: results queue empty
26264 1727204247.97157: checking for any_errors_fatal
26264 1727204247.97158: done checking for any_errors_fatal
26264 1727204247.97159: checking for max_fail_percentage
26264 1727204247.97160: done checking for max_fail_percentage
26264 1727204247.97161: checking to see if all hosts have failed and the running result is not ok
26264 1727204247.97162: done checking to see if all hosts have failed
26264 1727204247.97163: getting the remaining hosts for this loop
26264 1727204247.97166: done getting the remaining hosts for this loop
26264 1727204247.97171: getting the next task for host managed-node3
26264 1727204247.97179: done getting next task for host managed-node3
26264 1727204247.97184: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
26264 1727204247.97187: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204247.97202: getting variables
26264 1727204247.97204: in VariableManager get_vars()
26264 1727204247.97245: Calling all_inventory to load vars for managed-node3
26264 1727204247.97251: Calling groups_inventory to load vars for managed-node3
26264 1727204247.97254: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204247.97268: Calling all_plugins_play to load vars for managed-node3
26264 1727204247.97272: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204247.97275: Calling groups_plugins_play to load vars for managed-node3
26264 1727204247.97498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204247.97917: done with get_vars()
26264 1727204247.97930: done getting variables
26264 1727204247.98016: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000279
26264 1727204247.98019: WORKER PROCESS EXITING

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024  14:57:27 -0400 (0:00:00.143)       0:00:11.830 *****
26264 1727204247.98106: entering _queue_task() for managed-node3/stat
26264 1727204247.98843: worker is 1 (out of 1 available)
26264 1727204247.98859: exiting _queue_task() for managed-node3/stat
26264 1727204247.98873: done queuing things up, now waiting for results queue to drain
26264 1727204247.98967: waiting for pending results...
26264 1727204247.99857: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree
26264 1727204248.00257: in run() - task 0affcd87-79f5-5ff5-08b0-00000000027b
26264 1727204248.00284: variable 'ansible_search_path' from source: unknown
26264 1727204248.00293: variable 'ansible_search_path' from source: unknown
26264 1727204248.00455: calling self._execute()
26264 1727204248.00559: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204248.00657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204248.00677: variable 'omit' from source: magic vars
26264 1727204248.01369: variable 'ansible_distribution_major_version' from source: facts
26264 1727204248.01485: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204248.01899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
26264 1727204248.02574: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
26264 1727204248.02624: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
26264 1727204248.02746: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
26264 1727204248.02795: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
26264 1727204248.03046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
26264 1727204248.03091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
26264 1727204248.03123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
26264 1727204248.03207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
26264 1727204248.03379: variable '__network_is_ostree' from source: set_fact
26264 1727204248.03507: Evaluated conditional (not __network_is_ostree is defined): False
26264 1727204248.03518: when evaluation is False, skipping this task
26264 1727204248.03525: _execute() done
26264 1727204248.03533: dumping result to json
26264 1727204248.03540: done dumping result, returning
26264 1727204248.03554: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-5ff5-08b0-00000000027b]
26264 1727204248.03567: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000027b
26264 1727204248.03704: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000027b
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
26264 1727204248.03762: no more pending results, returning what we have
26264 1727204248.03768: results queue empty
26264 1727204248.03769: checking for any_errors_fatal
26264 1727204248.03774: done checking for any_errors_fatal
26264 1727204248.03774: checking for max_fail_percentage
26264 1727204248.03776: done checking for max_fail_percentage
26264 1727204248.03777: checking to see if all hosts have failed and the running result is not ok
26264 1727204248.03778: done checking to see if all hosts have failed
26264 1727204248.03779: getting the remaining hosts for this loop
26264 1727204248.03781: done getting the remaining hosts for this loop
26264 1727204248.03785: getting the next task for host managed-node3
26264 1727204248.03791: done getting next task for host managed-node3
26264 1727204248.03796: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
26264 1727204248.03799: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204248.03812: getting variables
26264 1727204248.03814: in VariableManager get_vars()
26264 1727204248.03856: Calling all_inventory to load vars for managed-node3
26264 1727204248.03859: Calling groups_inventory to load vars for managed-node3
26264 1727204248.03862: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204248.03874: Calling all_plugins_play to load vars for managed-node3
26264 1727204248.03878: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204248.03881: Calling groups_plugins_play to load vars for managed-node3
26264 1727204248.04183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204248.04721: done with get_vars()
26264 1727204248.04735: done getting variables
26264 1727204248.04946: WORKER PROCESS EXITING
26264 1727204248.04989: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024  14:57:28 -0400 (0:00:00.069)       0:00:11.899 *****
26264 1727204248.05023: entering _queue_task() for managed-node3/set_fact
26264 1727204248.05612: worker is 1 (out of 1 available)
26264 1727204248.05624: exiting _queue_task() for managed-node3/set_fact
26264 1727204248.05638: done queuing things up, now waiting for results queue to drain
26264 1727204248.05640: waiting for pending results...
26264 1727204248.05911: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
26264 1727204248.06093: in run() - task 0affcd87-79f5-5ff5-08b0-00000000027c
26264 1727204248.06113: variable 'ansible_search_path' from source: unknown
26264 1727204248.06121: variable 'ansible_search_path' from source: unknown
26264 1727204248.06205: calling self._execute()
26264 1727204248.06291: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204248.06307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204248.06321: variable 'omit' from source: magic vars
26264 1727204248.06735: variable 'ansible_distribution_major_version' from source: facts
26264 1727204248.06758: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204248.06937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
26264 1727204248.07310: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
26264 1727204248.07366: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
26264 1727204248.07407: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
26264 1727204248.07451: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
26264 1727204248.07537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
26264 1727204248.07576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
26264 1727204248.07612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
26264 1727204248.07646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
26264 1727204248.07743: variable '__network_is_ostree' from source: set_fact
26264 1727204248.07759: Evaluated conditional (not __network_is_ostree is defined): False
26264 1727204248.07775: when evaluation is False, skipping this task
26264 1727204248.07783: _execute() done
26264 1727204248.07791: dumping result to json
26264 1727204248.07799: done dumping result, returning
26264 1727204248.07810: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-5ff5-08b0-00000000027c]
26264 1727204248.07824: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000027c
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
26264 1727204248.07973: no more pending results, returning what we have
26264 1727204248.07977: results queue empty
26264 1727204248.07978: checking for any_errors_fatal
26264 1727204248.07983: done checking for any_errors_fatal
26264 1727204248.07984: checking for max_fail_percentage
26264 1727204248.07986: done checking for max_fail_percentage
26264 1727204248.07987: checking to see if all hosts have failed and the running result is not ok
26264 1727204248.07988: done checking to see if all hosts have failed
26264 1727204248.07989: getting the remaining hosts for this loop
26264 1727204248.07991: done getting the remaining hosts for this loop
26264 1727204248.07996: getting the next task for host managed-node3
26264 1727204248.08004: done getting next task for host managed-node3
26264 1727204248.08009: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
26264 1727204248.08012: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204248.08028: getting variables
26264 1727204248.08030: in VariableManager get_vars()
26264 1727204248.08072: Calling all_inventory to load vars for managed-node3
26264 1727204248.08076: Calling groups_inventory to load vars for managed-node3
26264 1727204248.08079: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204248.08090: Calling all_plugins_play to load vars for managed-node3
26264 1727204248.08094: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204248.08097: Calling groups_plugins_play to load vars for managed-node3
26264 1727204248.08630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204248.09588: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000027c
26264 1727204248.09591: WORKER PROCESS EXITING
26264 1727204248.09778: done with get_vars()
26264 1727204248.09982: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Tuesday 24 September 2024  14:57:28 -0400 (0:00:00.054)       0:00:11.954 *****
26264 1727204248.10512: entering _queue_task() for managed-node3/service_facts
26264 1727204248.10514: Creating lock for service_facts
26264 1727204248.10884: worker is 1 (out of 1 available)
26264 1727204248.10897: exiting _queue_task() for managed-node3/service_facts
26264 1727204248.10910: done queuing things up, now waiting for results queue to drain
26264 1727204248.10911: waiting for pending results...
26264 1727204248.11809: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running
26264 1727204248.12419: in run() - task 0affcd87-79f5-5ff5-08b0-00000000027e
26264 1727204248.12503: variable 'ansible_search_path' from source: unknown
26264 1727204248.12524: variable 'ansible_search_path' from source: unknown
26264 1727204248.12659: calling self._execute()
26264 1727204248.13236: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204248.13296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204248.13362: variable 'omit' from source: magic vars
26264 1727204248.14234: variable 'ansible_distribution_major_version' from source: facts
26264 1727204248.14285: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204248.14296: variable 'omit' from source: magic vars
26264 1727204248.14590: variable 'omit' from source: magic vars
26264 1727204248.14869: variable 'omit' from source: magic vars
26264 1727204248.14998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
26264 1727204248.15302: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
26264 1727204248.15398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
26264 1727204248.15497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204248.15625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204248.15880: variable 'inventory_hostname' from source: host vars for 'managed-node3'
26264 1727204248.15889: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204248.15896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204248.16318: Set connection var ansible_pipelining to False
26264 1727204248.16327: Set connection var ansible_connection to ssh
26264 1727204248.16334: Set connection var ansible_shell_type to sh
26264 1727204248.16344: Set connection var ansible_shell_executable to /bin/sh
26264 1727204248.16362: Set connection var ansible_timeout to 10
26264 1727204248.16427: Set connection var ansible_module_compression to ZIP_DEFLATED
26264 1727204248.16463: variable 'ansible_shell_executable' from source: unknown
26264 1727204248.16476: variable 'ansible_connection' from source: unknown
26264 1727204248.16483: variable 'ansible_module_compression' from source: unknown
26264 1727204248.16490: variable 'ansible_shell_type' from source: unknown
26264 1727204248.16500: variable 'ansible_shell_executable' from source: unknown
26264 1727204248.16507: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204248.16515: variable 'ansible_pipelining' from source: unknown
26264 1727204248.16522: variable 'ansible_timeout' from source: unknown
26264 1727204248.16530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204248.17118: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
26264 1727204248.17135: variable 'omit' from source: magic vars
26264 1727204248.17258: starting attempt loop
26264 1727204248.17270: running the handler
26264 1727204248.17287: _low_level_execute_command(): starting
26264 1727204248.17299: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
26264 1727204248.18999: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
26264 1727204248.19017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204248.19032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204248.19055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204248.19101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204248.19116: stderr chunk (state=3): >>>debug2: match not found <<<
26264 1727204248.19131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204248.19153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
26264 1727204248.19169: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
26264 1727204248.19181: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
26264 1727204248.19194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204248.19274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204248.19373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204248.19436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204248.19512: stderr chunk (state=3): >>>debug2: match found <<<
26264 1727204248.19566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204248.19750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204248.19809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204248.19893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204248.20046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204248.21834: stdout chunk (state=3): >>>/root <<<
26264 1727204248.21925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204248.22225: stderr chunk (state=3): >>><<<
26264 1727204248.22245: stdout chunk (state=3): >>><<<
26264 1727204248.22423: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
26264 1727204248.22427: _low_level_execute_command(): starting
26264 1727204248.22431: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918 `" && echo ansible-tmp-1727204248.2232914-27373-181241072565918="` echo /root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918 `" ) && sleep 0'
26264 1727204248.24132: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
26264 1727204248.24147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204248.24168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204248.24187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204248.24230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204248.24242: stderr chunk (state=3): >>>debug2: match not found <<<
26264 1727204248.24259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204248.24280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
26264 1727204248.24292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
26264 1727204248.24304: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
26264 1727204248.24316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204248.24330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204248.24345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204248.24360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204248.24376: stderr chunk (state=3): >>>debug2: match found <<<
26264 1727204248.24422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204248.24503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204248.24526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204248.24544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<< 26264 1727204248.24655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204248.26433: stdout chunk (state=3): >>>ansible-tmp-1727204248.2232914-27373-181241072565918=/root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918 <<< 26264 1727204248.26640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204248.26992: stderr chunk (state=3): >>><<< 26264 1727204248.26995: stdout chunk (state=3): >>><<< 26264 1727204248.27031: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204248.2232914-27373-181241072565918=/root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204248.27114: variable 'ansible_module_compression' from source: unknown 26264 1727204248.27256: ANSIBALLZ: Using lock for service_facts 26264 
1727204248.27268: ANSIBALLZ: Acquiring lock 26264 1727204248.27271: ANSIBALLZ: Lock acquired: 139841024715680 26264 1727204248.27274: ANSIBALLZ: Creating module 26264 1727204248.79604: ANSIBALLZ: Writing module into payload 26264 1727204248.79970: ANSIBALLZ: Writing module 26264 1727204248.80219: ANSIBALLZ: Renaming module 26264 1727204248.80231: ANSIBALLZ: Done creating module 26264 1727204248.80258: variable 'ansible_facts' from source: unknown 26264 1727204248.80463: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918/AnsiballZ_service_facts.py 26264 1727204248.81095: Sending initial data 26264 1727204248.81105: Sent initial data (162 bytes) 26264 1727204248.84029: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204248.84055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204248.84075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204248.84156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204248.84204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204248.84216: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204248.84230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204248.84258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204248.84366: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204248.84381: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204248.84395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204248.84410: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 26264 1727204248.84427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204248.84440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204248.84460: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204248.85223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204248.85360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204248.85452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204248.85471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204248.85614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204248.87445: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204248.87490: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204248.87503: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpep2fceid /root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918/AnsiballZ_service_facts.py <<< 26264 1727204248.87544: stderr chunk (state=3): >>>debug1: Couldn't stat 
remote file: No such file or directory <<< 26264 1727204248.89222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204248.89594: stderr chunk (state=3): >>><<< 26264 1727204248.89598: stdout chunk (state=3): >>><<< 26264 1727204248.89601: done transferring module to remote 26264 1727204248.89607: _low_level_execute_command(): starting 26264 1727204248.89610: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918/ /root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918/AnsiballZ_service_facts.py && sleep 0' 26264 1727204248.91450: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204248.91469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204248.91484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204248.91501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204248.91555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204248.91570: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204248.91584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204248.91602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204248.91613: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204248.91624: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204248.91644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204248.91663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204248.91761: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204248.91777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204248.91788: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204248.91802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204248.91887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204248.91986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204248.92003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204248.92085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204248.93900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204248.93904: stdout chunk (state=3): >>><<< 26264 1727204248.93907: stderr chunk (state=3): >>><<< 26264 1727204248.94010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204248.94014: _low_level_execute_command(): starting 26264 1727204248.94017: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918/AnsiballZ_service_facts.py && sleep 0' 26264 1727204248.95488: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204248.95492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204248.95532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204248.95536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204248.95538: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204248.95715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204248.95788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204248.95951: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204250.22800: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": 
"dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": 
"kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stop<<< 26264 1727204250.22833: stdout chunk (state=3): >>>ped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtim<<< 26264 1727204250.22837: stdout chunk (state=3): >>>e-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 26264 1727204250.24041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204250.24122: stderr chunk (state=3): >>><<< 26264 1727204250.24125: stdout chunk (state=3): >>><<< 26264 1727204250.24174: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": 
"systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204250.29277: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204250.29282: _low_level_execute_command(): starting 26264 1727204250.29285: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204248.2232914-27373-181241072565918/ > /dev/null 2>&1 && sleep 0' 26264 1727204250.30476: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204250.30641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204250.32406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204250.32412: stderr chunk (state=3): >>><<< 26264 1727204250.32416: stdout chunk (state=3): >>><<< 26264 1727204250.32442: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204250.32446: handler run complete 26264 1727204250.32628: variable 'ansible_facts' from source: unknown 26264 1727204250.32772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204250.33302: variable 'ansible_facts' from source: unknown 26264 1727204250.33446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204250.33763: attempt loop complete, returning result 26264 1727204250.33769: _execute() done 26264 1727204250.33771: dumping result to json 26264 1727204250.33934: done dumping result, returning 26264 1727204250.33943: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-5ff5-08b0-00000000027e] 26264 1727204250.33952: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000027e 26264 1727204250.34678: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000027e 26264 1727204250.34682: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204250.34739: no more pending results, returning what we have 26264 1727204250.34744: results queue empty 26264 1727204250.34745: checking for any_errors_fatal 26264 1727204250.34749: done checking for any_errors_fatal 26264 1727204250.34750: checking for max_fail_percentage 26264 1727204250.34752: done checking for max_fail_percentage 26264 1727204250.34753: checking to see if all hosts have failed and the running result is not ok 26264 1727204250.34754: done checking to see if all hosts 
have failed 26264 1727204250.34754: getting the remaining hosts for this loop 26264 1727204250.34756: done getting the remaining hosts for this loop 26264 1727204250.34761: getting the next task for host managed-node3 26264 1727204250.34767: done getting next task for host managed-node3 26264 1727204250.34770: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 26264 1727204250.34774: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204250.34784: getting variables 26264 1727204250.34786: in VariableManager get_vars() 26264 1727204250.35044: Calling all_inventory to load vars for managed-node3 26264 1727204250.35048: Calling groups_inventory to load vars for managed-node3 26264 1727204250.35050: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204250.35099: Calling all_plugins_play to load vars for managed-node3 26264 1727204250.35103: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204250.35107: Calling groups_plugins_play to load vars for managed-node3 26264 1727204250.35810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204250.36682: done with get_vars() 26264 1727204250.36697: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:30 -0400 (0:00:02.262) 0:00:14.217 ***** 26264 1727204250.36798: entering _queue_task() for managed-node3/package_facts 26264 1727204250.36800: Creating lock for package_facts 26264 1727204250.37091: worker is 1 (out of 1 available) 26264 1727204250.37106: exiting _queue_task() for managed-node3/package_facts 26264 1727204250.37119: done queuing things up, now waiting for results queue to drain 26264 1727204250.37120: waiting for pending results... 26264 1727204250.37406: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 26264 1727204250.37555: in run() - task 0affcd87-79f5-5ff5-08b0-00000000027f 26264 1727204250.37580: variable 'ansible_search_path' from source: unknown 26264 1727204250.37589: variable 'ansible_search_path' from source: unknown 26264 1727204250.37629: calling self._execute() 26264 1727204250.37722: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204250.37733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204250.37750: variable 'omit' from source: magic vars 26264 1727204250.38170: variable 'ansible_distribution_major_version' from source: facts 26264 1727204250.38189: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204250.38201: variable 'omit' from source: magic vars 26264 1727204250.38265: variable 'omit' from source: magic vars 26264 1727204250.38310: variable 'omit' from source: magic vars 26264 1727204250.38814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204250.38883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204250.38914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
26264 1727204250.38958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204250.38991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204250.39051: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204250.39073: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204250.39084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204250.39273: Set connection var ansible_pipelining to False 26264 1727204250.39288: Set connection var ansible_connection to ssh 26264 1727204250.39300: Set connection var ansible_shell_type to sh 26264 1727204250.39323: Set connection var ansible_shell_executable to /bin/sh 26264 1727204250.39387: Set connection var ansible_timeout to 10 26264 1727204250.39406: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204250.39439: variable 'ansible_shell_executable' from source: unknown 26264 1727204250.39507: variable 'ansible_connection' from source: unknown 26264 1727204250.39572: variable 'ansible_module_compression' from source: unknown 26264 1727204250.39592: variable 'ansible_shell_type' from source: unknown 26264 1727204250.39602: variable 'ansible_shell_executable' from source: unknown 26264 1727204250.39609: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204250.39617: variable 'ansible_pipelining' from source: unknown 26264 1727204250.39623: variable 'ansible_timeout' from source: unknown 26264 1727204250.39652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204250.40090: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204250.40191: variable 'omit' from source: magic vars 26264 1727204250.40210: starting attempt loop 26264 1727204250.40227: running the handler 26264 1727204250.40260: _low_level_execute_command(): starting 26264 1727204250.40350: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204250.43017: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204250.43037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204250.43055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204250.43075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204250.43115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204250.43126: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204250.43137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204250.43175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204250.43187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204250.43197: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204250.43207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204250.43219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204250.43236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204250.43250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 <<< 26264 1727204250.43262: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204250.43279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204250.43373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204250.43401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204250.43417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204250.43490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204250.45057: stdout chunk (state=3): >>>/root <<< 26264 1727204250.45380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204250.46143: stderr chunk (state=3): >>><<< 26264 1727204250.46146: stdout chunk (state=3): >>><<< 26264 1727204250.46294: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204250.46298: _low_level_execute_command(): starting 26264 1727204250.46301: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883 `" && echo ansible-tmp-1727204250.4618444-27460-19249984457883="` echo /root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883 `" ) && sleep 0' 26264 1727204250.47988: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204250.48026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204250.48056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204250.48092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204250.48174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204250.48204: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204250.48219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204250.48241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204250.48257: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204250.48273: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204250.48287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204250.48321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204250.48342: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204250.48359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204250.48377: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204250.48391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204250.48579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204250.48596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204250.48627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204250.48767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204250.50523: stdout chunk (state=3): >>>ansible-tmp-1727204250.4618444-27460-19249984457883=/root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883 <<< 26264 1727204250.50718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204250.50722: stdout chunk (state=3): >>><<< 26264 1727204250.50724: stderr chunk (state=3): >>><<< 26264 1727204250.51178: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204250.4618444-27460-19249984457883=/root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204250.51181: variable 'ansible_module_compression' from source: unknown 26264 1727204250.51184: ANSIBALLZ: Using lock for package_facts 26264 1727204250.51186: ANSIBALLZ: Acquiring lock 26264 1727204250.51188: ANSIBALLZ: Lock acquired: 139841024164016 26264 1727204250.51190: ANSIBALLZ: Creating module 26264 1727204251.07328: ANSIBALLZ: Writing module into payload 26264 1727204251.07739: ANSIBALLZ: Writing module 26264 1727204251.07775: ANSIBALLZ: Renaming module 26264 1727204251.07779: ANSIBALLZ: Done creating module 26264 1727204251.07817: variable 'ansible_facts' from source: unknown 26264 1727204251.08238: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883/AnsiballZ_package_facts.py 26264 1727204251.08409: Sending initial data 26264 1727204251.08413: Sent initial data (161 bytes) 26264 1727204251.09737: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204251.09747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204251.09758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204251.09773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204251.09921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204251.09928: stderr chunk (state=3): 
>>>debug2: match not found <<< 26264 1727204251.09938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204251.09953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204251.09959: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204251.09968: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204251.09976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204251.09986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204251.09999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204251.10011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204251.10027: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204251.10037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204251.10110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204251.10247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204251.10258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204251.10340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204251.12167: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204251.12200: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204251.12241: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpxm8ra_l2 /root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883/AnsiballZ_package_facts.py <<< 26264 1727204251.12282: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204251.15284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204251.15409: stderr chunk (state=3): >>><<< 26264 1727204251.15413: stdout chunk (state=3): >>><<< 26264 1727204251.15436: done transferring module to remote 26264 1727204251.15450: _low_level_execute_command(): starting 26264 1727204251.15453: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883/ /root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883/AnsiballZ_package_facts.py && sleep 0' 26264 1727204251.17024: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204251.17028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204251.17112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204251.17118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204251.17196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204251.17201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204251.17275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204251.17409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204251.17415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204251.17486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204251.19292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204251.19297: stderr chunk (state=3): >>><<< 26264 1727204251.19302: stdout chunk (state=3): >>><<< 26264 1727204251.19332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204251.19335: _low_level_execute_command(): starting 26264 1727204251.19338: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883/AnsiballZ_package_facts.py && sleep 0' 26264 1727204251.20945: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204251.20950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204251.20993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204251.20997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204251.21011: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204251.21014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204251.21027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204251.21032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 26264 1727204251.21098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204251.21112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204251.21118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204251.21202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204251.66989: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", 
"release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gli<<< 26264 1727204251.67042: stdout chunk (state=3): >>>bc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": 
"libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": 
[{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 26264 1727204251.67143: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": 
"grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 26264 1727204251.67320: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", 
"release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 26264 1727204251.67595: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 26264 1727204251.67654: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 26264 1727204251.67677: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", 
"release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": 
"python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 26264 1727204251.67738: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": 
"16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 26264 1727204251.67746: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 26264 1727204251.67750: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": 
"9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 26264 1727204251.67796: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": 
"1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": 
[{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 26264 1727204251.67801: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": 
"5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", 
"release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 26264 1727204251.67834: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 26264 1727204251.68610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204251.68614: stderr chunk (state=3): >>><<< 26264 1727204251.68616: stdout chunk (state=3): >>><<< 26264 1727204251.68987: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": 
[{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": 
[{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", 
"version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", 
"release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": 
"1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", 
"version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": 
"4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", 
"version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": 
[{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", 
"release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": 
"1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": 
"10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": 
"0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", 
"version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204251.73261: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204251.73303: _low_level_execute_command(): starting 26264 1727204251.73315: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204250.4618444-27460-19249984457883/ > /dev/null 2>&1 && sleep 0' 26264 1727204251.74014: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204251.76375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204251.76393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204251.76412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204251.76468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204251.76491: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204251.76505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204251.76521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204251.76532: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address 
<<< 26264 1727204251.76541: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204251.76556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204251.76571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204251.76591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204251.76713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204251.76728: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204251.76743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204251.76835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204251.76858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204251.76876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204251.76985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204251.78860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204251.78869: stdout chunk (state=3): >>><<< 26264 1727204251.78873: stderr chunk (state=3): >>><<< 26264 1727204251.79077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204251.79081: handler run complete 26264 1727204251.80478: variable 'ansible_facts' from source: unknown 26264 1727204251.81661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204251.85582: variable 'ansible_facts' from source: unknown 26264 1727204251.99779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204252.01545: attempt loop complete, returning result 26264 1727204252.01651: _execute() done 26264 1727204252.01661: dumping result to json 26264 1727204252.02156: done dumping result, returning 26264 1727204252.02288: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-5ff5-08b0-00000000027f] 26264 1727204252.02298: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000027f 26264 1727204252.06281: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000027f 26264 1727204252.06285: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204252.06403: no more pending results, returning what we have 26264 1727204252.06407: results queue empty 26264 1727204252.06408: checking for 
any_errors_fatal 26264 1727204252.06411: done checking for any_errors_fatal 26264 1727204252.06412: checking for max_fail_percentage 26264 1727204252.06413: done checking for max_fail_percentage 26264 1727204252.06414: checking to see if all hosts have failed and the running result is not ok 26264 1727204252.06415: done checking to see if all hosts have failed 26264 1727204252.06416: getting the remaining hosts for this loop 26264 1727204252.06418: done getting the remaining hosts for this loop 26264 1727204252.06422: getting the next task for host managed-node3 26264 1727204252.06429: done getting next task for host managed-node3 26264 1727204252.06433: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 26264 1727204252.06435: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204252.06446: getting variables 26264 1727204252.06450: in VariableManager get_vars() 26264 1727204252.06484: Calling all_inventory to load vars for managed-node3 26264 1727204252.06487: Calling groups_inventory to load vars for managed-node3 26264 1727204252.06490: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204252.06499: Calling all_plugins_play to load vars for managed-node3 26264 1727204252.06502: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204252.06505: Calling groups_plugins_play to load vars for managed-node3 26264 1727204252.09412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204252.11414: done with get_vars() 26264 1727204252.11439: done getting variables 26264 1727204252.11510: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:32 -0400 (0:00:01.747) 0:00:15.964 ***** 26264 1727204252.11546: entering _queue_task() for managed-node3/debug 26264 1727204252.11898: worker is 1 (out of 1 available) 26264 1727204252.11924: exiting _queue_task() for managed-node3/debug 26264 1727204252.11937: done queuing things up, now waiting for results queue to drain 26264 1727204252.11938: waiting for pending results... 
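The `Print network provider` task queued here (task path `roles/network/tasks/main.yml:7`) is a plain `debug` action. Given that the log records `variable 'network_provider' from source: set_fact` and the task later prints `Using network provider: nm`, it is presumably equivalent to something like:

```yaml
# Sketch, assuming the message template from the printed result;
# network_provider is set by an earlier set_fact, per the log.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```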
26264 1727204252.13093: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 26264 1727204252.13218: in run() - task 0affcd87-79f5-5ff5-08b0-00000000001a 26264 1727204252.13354: variable 'ansible_search_path' from source: unknown 26264 1727204252.13378: variable 'ansible_search_path' from source: unknown 26264 1727204252.13483: calling self._execute() 26264 1727204252.13669: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204252.13680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204252.13694: variable 'omit' from source: magic vars 26264 1727204252.14086: variable 'ansible_distribution_major_version' from source: facts 26264 1727204252.14114: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204252.14125: variable 'omit' from source: magic vars 26264 1727204252.14161: variable 'omit' from source: magic vars 26264 1727204252.14270: variable 'network_provider' from source: set_fact 26264 1727204252.14295: variable 'omit' from source: magic vars 26264 1727204252.14352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204252.14397: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204252.14434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204252.14456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204252.14475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204252.14507: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204252.14516: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 
1727204252.14528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204252.14649: Set connection var ansible_pipelining to False 26264 1727204252.14659: Set connection var ansible_connection to ssh 26264 1727204252.14669: Set connection var ansible_shell_type to sh 26264 1727204252.14680: Set connection var ansible_shell_executable to /bin/sh 26264 1727204252.14692: Set connection var ansible_timeout to 10 26264 1727204252.14702: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204252.14732: variable 'ansible_shell_executable' from source: unknown 26264 1727204252.14744: variable 'ansible_connection' from source: unknown 26264 1727204252.14758: variable 'ansible_module_compression' from source: unknown 26264 1727204252.14766: variable 'ansible_shell_type' from source: unknown 26264 1727204252.14773: variable 'ansible_shell_executable' from source: unknown 26264 1727204252.14779: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204252.14785: variable 'ansible_pipelining' from source: unknown 26264 1727204252.14790: variable 'ansible_timeout' from source: unknown 26264 1727204252.14796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204252.14940: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204252.14963: variable 'omit' from source: magic vars 26264 1727204252.14981: starting attempt loop 26264 1727204252.14988: running the handler 26264 1727204252.15031: handler run complete 26264 1727204252.15050: attempt loop complete, returning result 26264 1727204252.15056: _execute() done 26264 1727204252.15064: dumping result to json 26264 1727204252.15086: done dumping result, returning 
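The "Set connection var" records above (`ansible_connection=ssh`, `ansible_shell_executable=/bin/sh`, `ansible_timeout=10`, pipelining disabled) come from defaults plus host vars; the inventory-parsing records at the start of the run show `ansible_host` and `ansible_ssh_extra_args` being set per host. A hypothetical inventory fragment consistent with those records, with the SSH option values assumed (the `auto-mux: Trying existing master` lines in the stderr dumps indicate ControlMaster-style multiplexing is in effect, but the exact flags are not shown):

```yaml
# Hypothetical inventory sketch; the address matches the SSH debug output,
# the extra-args value is an assumption.
all:
  hosts:
    managed-node3:
      ansible_host: 10.31.15.87
      ansible_ssh_extra_args: -o ControlMaster=auto -o ControlPersist=60s  # assumption
```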
26264 1727204252.15120: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-5ff5-08b0-00000000001a] 26264 1727204252.15130: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001a ok: [managed-node3] => {} MSG: Using network provider: nm 26264 1727204252.15297: no more pending results, returning what we have 26264 1727204252.15302: results queue empty 26264 1727204252.15303: checking for any_errors_fatal 26264 1727204252.15311: done checking for any_errors_fatal 26264 1727204252.15312: checking for max_fail_percentage 26264 1727204252.15314: done checking for max_fail_percentage 26264 1727204252.15315: checking to see if all hosts have failed and the running result is not ok 26264 1727204252.15317: done checking to see if all hosts have failed 26264 1727204252.15317: getting the remaining hosts for this loop 26264 1727204252.15320: done getting the remaining hosts for this loop 26264 1727204252.15324: getting the next task for host managed-node3 26264 1727204252.15330: done getting next task for host managed-node3 26264 1727204252.15336: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 26264 1727204252.15339: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204252.15349: getting variables 26264 1727204252.15351: in VariableManager get_vars() 26264 1727204252.15391: Calling all_inventory to load vars for managed-node3 26264 1727204252.15395: Calling groups_inventory to load vars for managed-node3 26264 1727204252.15397: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204252.15408: Calling all_plugins_play to load vars for managed-node3 26264 1727204252.15411: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204252.15414: Calling groups_plugins_play to load vars for managed-node3 26264 1727204252.16442: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001a 26264 1727204252.16446: WORKER PROCESS EXITING 26264 1727204252.17362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204252.21382: done with get_vars() 26264 1727204252.21417: done getting variables 26264 1727204252.21611: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.100) 0:00:16.065 ***** 26264 1727204252.21646: entering _queue_task() for managed-node3/fail 26264 1727204252.24318: worker is 1 (out of 1 available) 26264 1727204252.24332: exiting _queue_task() for managed-node3/fail 26264 1727204252.24347: done queuing things up, now waiting for results queue to drain 26264 1727204252.24349: waiting for pending results... 
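The abort-for-initscripts task queued here (task path `roles/network/tasks/main.yml:11`) is skipped just below because `network_state != {}` evaluates False against the role-default `network_state`. In role terms it is presumably a guarded `fail`, roughly:

```yaml
# Sketch: only the guard expression is confirmed (the log reports
# false_condition "network_state != {}"); the message and any additional
# provider check are assumptions.
- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # assumption
  when: network_state != {}
```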
26264 1727204252.25269: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 26264 1727204252.25669: in run() - task 0affcd87-79f5-5ff5-08b0-00000000001b 26264 1727204252.25683: variable 'ansible_search_path' from source: unknown 26264 1727204252.25687: variable 'ansible_search_path' from source: unknown 26264 1727204252.25723: calling self._execute() 26264 1727204252.26171: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204252.26175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204252.26179: variable 'omit' from source: magic vars 26264 1727204252.27515: variable 'ansible_distribution_major_version' from source: facts 26264 1727204252.27520: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204252.27869: variable 'network_state' from source: role '' defaults 26264 1727204252.27879: Evaluated conditional (network_state != {}): False 26264 1727204252.27883: when evaluation is False, skipping this task 26264 1727204252.27886: _execute() done 26264 1727204252.27889: dumping result to json 26264 1727204252.27891: done dumping result, returning 26264 1727204252.27901: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-5ff5-08b0-00000000001b] 26264 1727204252.27907: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001b 26264 1727204252.28222: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001b 26264 1727204252.28225: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204252.28286: no more pending results, 
returning what we have 26264 1727204252.28291: results queue empty 26264 1727204252.28292: checking for any_errors_fatal 26264 1727204252.28298: done checking for any_errors_fatal 26264 1727204252.28299: checking for max_fail_percentage 26264 1727204252.28301: done checking for max_fail_percentage 26264 1727204252.28302: checking to see if all hosts have failed and the running result is not ok 26264 1727204252.28303: done checking to see if all hosts have failed 26264 1727204252.28304: getting the remaining hosts for this loop 26264 1727204252.28306: done getting the remaining hosts for this loop 26264 1727204252.28310: getting the next task for host managed-node3 26264 1727204252.28317: done getting next task for host managed-node3 26264 1727204252.28323: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 26264 1727204252.28326: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204252.28342: getting variables 26264 1727204252.28344: in VariableManager get_vars() 26264 1727204252.28389: Calling all_inventory to load vars for managed-node3 26264 1727204252.28393: Calling groups_inventory to load vars for managed-node3 26264 1727204252.28395: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204252.28408: Calling all_plugins_play to load vars for managed-node3 26264 1727204252.28411: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204252.28414: Calling groups_plugins_play to load vars for managed-node3 26264 1727204252.32227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204252.35519: done with get_vars() 26264 1727204252.35560: done getting variables 26264 1727204252.35625: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.140) 0:00:16.205 ***** 26264 1727204252.35668: entering _queue_task() for managed-node3/fail 26264 1727204252.36054: worker is 1 (out of 1 available) 26264 1727204252.36104: exiting _queue_task() for managed-node3/fail 26264 1727204252.36142: done queuing things up, now waiting for results queue to drain 26264 1727204252.36144: waiting for pending results... 
26264 1727204252.37371: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 26264 1727204252.37826: in run() - task 0affcd87-79f5-5ff5-08b0-00000000001c 26264 1727204252.37840: variable 'ansible_search_path' from source: unknown 26264 1727204252.37844: variable 'ansible_search_path' from source: unknown 26264 1727204252.37886: calling self._execute() 26264 1727204252.37984: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204252.37988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204252.37999: variable 'omit' from source: magic vars 26264 1727204252.38845: variable 'ansible_distribution_major_version' from source: facts 26264 1727204252.38861: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204252.39091: variable 'network_state' from source: role '' defaults 26264 1727204252.39231: Evaluated conditional (network_state != {}): False 26264 1727204252.39235: when evaluation is False, skipping this task 26264 1727204252.39238: _execute() done 26264 1727204252.39240: dumping result to json 26264 1727204252.39244: done dumping result, returning 26264 1727204252.39251: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-5ff5-08b0-00000000001c] 26264 1727204252.39259: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001c 26264 1727204252.39360: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001c 26264 1727204252.39365: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204252.39418: no more pending results, returning what we have 26264 
1727204252.39423: results queue empty 26264 1727204252.39424: checking for any_errors_fatal 26264 1727204252.39431: done checking for any_errors_fatal 26264 1727204252.39432: checking for max_fail_percentage 26264 1727204252.39434: done checking for max_fail_percentage 26264 1727204252.39435: checking to see if all hosts have failed and the running result is not ok 26264 1727204252.39437: done checking to see if all hosts have failed 26264 1727204252.39437: getting the remaining hosts for this loop 26264 1727204252.39439: done getting the remaining hosts for this loop 26264 1727204252.39444: getting the next task for host managed-node3 26264 1727204252.39453: done getting next task for host managed-node3 26264 1727204252.39458: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 26264 1727204252.39460: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204252.39479: getting variables 26264 1727204252.39481: in VariableManager get_vars() 26264 1727204252.39524: Calling all_inventory to load vars for managed-node3 26264 1727204252.39527: Calling groups_inventory to load vars for managed-node3 26264 1727204252.39530: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204252.39543: Calling all_plugins_play to load vars for managed-node3 26264 1727204252.39550: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204252.39554: Calling groups_plugins_play to load vars for managed-node3 26264 1727204252.41482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204252.43940: done with get_vars() 26264 1727204252.43970: done getting variables 26264 1727204252.44029: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.083) 0:00:16.289 ***** 26264 1727204252.44061: entering _queue_task() for managed-node3/fail 26264 1727204252.44605: worker is 1 (out of 1 available) 26264 1727204252.44618: exiting _queue_task() for managed-node3/fail 26264 1727204252.44631: done queuing things up, now waiting for results queue to drain 26264 1727204252.44632: waiting for pending results... 
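The teaming-abort task queued here (task path `roles/network/tasks/main.yml:25`) first passes the role's distribution gate and is then skipped on the guard `ansible_distribution_major_version | int > 9`, which the log evaluates to False on this EL9 host. A rough sketch:

```yaml
# Sketch: the when-expression is taken verbatim from the log's evaluated
# conditional; the failure message is an assumption.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later  # assumption
  when: ansible_distribution_major_version | int > 9
```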
26264 1727204252.44910: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 26264 1727204252.45012: in run() - task 0affcd87-79f5-5ff5-08b0-00000000001d 26264 1727204252.45027: variable 'ansible_search_path' from source: unknown 26264 1727204252.45030: variable 'ansible_search_path' from source: unknown 26264 1727204252.45071: calling self._execute() 26264 1727204252.45169: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204252.45175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204252.45190: variable 'omit' from source: magic vars 26264 1727204252.45599: variable 'ansible_distribution_major_version' from source: facts 26264 1727204252.45616: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204252.45805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204252.48310: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204252.48753: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204252.48790: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204252.48879: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204252.48905: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204252.49112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204252.49127: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204252.49288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.49329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204252.49343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204252.49558: variable 'ansible_distribution_major_version' from source: facts 26264 1727204252.49576: Evaluated conditional (ansible_distribution_major_version | int > 9): False 26264 1727204252.49580: when evaluation is False, skipping this task 26264 1727204252.49583: _execute() done 26264 1727204252.49588: dumping result to json 26264 1727204252.49635: done dumping result, returning 26264 1727204252.49643: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-5ff5-08b0-00000000001d] 26264 1727204252.49648: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001d 26264 1727204252.49751: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001d 26264 1727204252.49755: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 26264 1727204252.49808: no more pending results, returning what we have 26264 1727204252.49812: 
results queue empty 26264 1727204252.49813: checking for any_errors_fatal 26264 1727204252.49817: done checking for any_errors_fatal 26264 1727204252.49818: checking for max_fail_percentage 26264 1727204252.49820: done checking for max_fail_percentage 26264 1727204252.49821: checking to see if all hosts have failed and the running result is not ok 26264 1727204252.49822: done checking to see if all hosts have failed 26264 1727204252.49823: getting the remaining hosts for this loop 26264 1727204252.49825: done getting the remaining hosts for this loop 26264 1727204252.49830: getting the next task for host managed-node3 26264 1727204252.49836: done getting next task for host managed-node3 26264 1727204252.49841: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 26264 1727204252.49843: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204252.49860: getting variables 26264 1727204252.49862: in VariableManager get_vars() 26264 1727204252.49905: Calling all_inventory to load vars for managed-node3 26264 1727204252.49908: Calling groups_inventory to load vars for managed-node3 26264 1727204252.49911: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204252.49922: Calling all_plugins_play to load vars for managed-node3 26264 1727204252.49926: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204252.49929: Calling groups_plugins_play to load vars for managed-node3 26264 1727204252.51541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204252.53334: done with get_vars() 26264 1727204252.53365: done getting variables 26264 1727204252.53467: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.094) 0:00:16.384 ***** 26264 1727204252.53495: entering _queue_task() for managed-node3/dnf 26264 1727204252.54481: worker is 1 (out of 1 available) 26264 1727204252.54494: exiting _queue_task() for managed-node3/dnf 26264 1727204252.54508: done queuing things up, now waiting for results queue to drain 26264 1727204252.54510: waiting for pending results... 
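The `dnf` action loaded here (task path `roles/network/tasks/main.yml:36`) runs under the guard `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7`, which the log evaluates to True, and then consults `__network_wireless_connections_defined` from the role defaults. A rough sketch of such an "are updates available" probe, with everything beyond the confirmed guard treated as an assumption:

```yaml
# Sketch: the when-expression is confirmed by the log; the package list,
# state, and check_mode are assumptions about how an update-availability
# probe would be phrased.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # hypothetical variable name
    state: latest
  check_mode: true
  when: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
```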
26264 1727204252.54885: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 26264 1727204252.54989: in run() - task 0affcd87-79f5-5ff5-08b0-00000000001e 26264 1727204252.55003: variable 'ansible_search_path' from source: unknown 26264 1727204252.55008: variable 'ansible_search_path' from source: unknown 26264 1727204252.55046: calling self._execute() 26264 1727204252.55134: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204252.55138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204252.55149: variable 'omit' from source: magic vars 26264 1727204252.55673: variable 'ansible_distribution_major_version' from source: facts 26264 1727204252.55678: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204252.55776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204252.59675: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204252.59866: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204252.59901: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204252.59933: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204252.60075: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204252.60149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204252.60295: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204252.60320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.60363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204252.60491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204252.60726: variable 'ansible_distribution' from source: facts 26264 1727204252.60730: variable 'ansible_distribution_major_version' from source: facts 26264 1727204252.60745: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 26264 1727204252.60986: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204252.61237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204252.61375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204252.61400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.61439: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204252.61457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204252.61608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204252.61631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204252.61658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.61811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204252.61824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204252.61870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204252.62077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 
1727204252.62081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.62084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204252.62086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204252.62487: variable 'network_connections' from source: play vars 26264 1727204252.62500: variable 'interface' from source: set_fact 26264 1727204252.62689: variable 'interface' from source: set_fact 26264 1727204252.62697: variable 'interface' from source: set_fact 26264 1727204252.62759: variable 'interface' from source: set_fact 26264 1727204252.62935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204252.63332: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204252.63376: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204252.63541: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204252.63578: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204252.63625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204252.63774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204252.63798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.63822: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204252.64002: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204252.64618: variable 'network_connections' from source: play vars 26264 1727204252.64625: variable 'interface' from source: set_fact 26264 1727204252.64697: variable 'interface' from source: set_fact 26264 1727204252.64704: variable 'interface' from source: set_fact 26264 1727204252.64890: variable 'interface' from source: set_fact 26264 1727204252.64926: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26264 1727204252.64930: when evaluation is False, skipping this task 26264 1727204252.64932: _execute() done 26264 1727204252.64935: dumping result to json 26264 1727204252.65056: done dumping result, returning 26264 1727204252.65068: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-00000000001e] 26264 1727204252.65073: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001e skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26264 1727204252.65231: no more pending results, returning what we have 26264 1727204252.65236: results queue 
empty 26264 1727204252.65237: checking for any_errors_fatal 26264 1727204252.65243: done checking for any_errors_fatal 26264 1727204252.65243: checking for max_fail_percentage 26264 1727204252.65245: done checking for max_fail_percentage 26264 1727204252.65246: checking to see if all hosts have failed and the running result is not ok 26264 1727204252.65250: done checking to see if all hosts have failed 26264 1727204252.65251: getting the remaining hosts for this loop 26264 1727204252.65253: done getting the remaining hosts for this loop 26264 1727204252.65257: getting the next task for host managed-node3 26264 1727204252.65266: done getting next task for host managed-node3 26264 1727204252.65272: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 26264 1727204252.65274: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204252.65285: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001e 26264 1727204252.65289: WORKER PROCESS EXITING 26264 1727204252.65298: getting variables 26264 1727204252.65300: in VariableManager get_vars() 26264 1727204252.65339: Calling all_inventory to load vars for managed-node3 26264 1727204252.65343: Calling groups_inventory to load vars for managed-node3 26264 1727204252.65345: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204252.65360: Calling all_plugins_play to load vars for managed-node3 26264 1727204252.65363: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204252.65368: Calling groups_plugins_play to load vars for managed-node3 26264 1727204252.68509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204252.71342: done with get_vars() 26264 1727204252.72085: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 26264 1727204252.72175: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.187) 0:00:16.571 ***** 26264 1727204252.72209: entering _queue_task() for managed-node3/yum 26264 1727204252.72211: Creating lock for yum 26264 1727204252.72531: worker is 1 (out of 1 available) 26264 1727204252.72542: exiting _queue_task() for managed-node3/yum 26264 
1727204252.72558: done queuing things up, now waiting for results queue to drain 26264 1727204252.72560: waiting for pending results... 26264 1727204252.73545: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 26264 1727204252.73681: in run() - task 0affcd87-79f5-5ff5-08b0-00000000001f 26264 1727204252.73787: variable 'ansible_search_path' from source: unknown 26264 1727204252.73815: variable 'ansible_search_path' from source: unknown 26264 1727204252.73961: calling self._execute() 26264 1727204252.74081: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204252.74143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204252.74217: variable 'omit' from source: magic vars 26264 1727204252.75080: variable 'ansible_distribution_major_version' from source: facts 26264 1727204252.75137: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204252.75527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204252.82278: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204252.82412: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204252.82450: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204252.82604: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204252.82632: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204252.82826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204252.82861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204252.82885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.83040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204252.83058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204252.83274: variable 'ansible_distribution_major_version' from source: facts 26264 1727204252.83291: Evaluated conditional (ansible_distribution_major_version | int < 8): False 26264 1727204252.83297: when evaluation is False, skipping this task 26264 1727204252.83299: _execute() done 26264 1727204252.83302: dumping result to json 26264 1727204252.83304: done dumping result, returning 26264 1727204252.83306: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-00000000001f] 26264 1727204252.83309: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001f skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 26264 1727204252.83502: no more pending results, returning 
what we have 26264 1727204252.83506: results queue empty 26264 1727204252.83507: checking for any_errors_fatal 26264 1727204252.83512: done checking for any_errors_fatal 26264 1727204252.83512: checking for max_fail_percentage 26264 1727204252.83514: done checking for max_fail_percentage 26264 1727204252.83515: checking to see if all hosts have failed and the running result is not ok 26264 1727204252.83516: done checking to see if all hosts have failed 26264 1727204252.83517: getting the remaining hosts for this loop 26264 1727204252.83518: done getting the remaining hosts for this loop 26264 1727204252.83523: getting the next task for host managed-node3 26264 1727204252.83529: done getting next task for host managed-node3 26264 1727204252.83535: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 26264 1727204252.83537: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204252.83556: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000001f 26264 1727204252.83560: WORKER PROCESS EXITING 26264 1727204252.83568: getting variables 26264 1727204252.83571: in VariableManager get_vars() 26264 1727204252.83610: Calling all_inventory to load vars for managed-node3 26264 1727204252.83614: Calling groups_inventory to load vars for managed-node3 26264 1727204252.83616: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204252.83628: Calling all_plugins_play to load vars for managed-node3 26264 1727204252.83631: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204252.83634: Calling groups_plugins_play to load vars for managed-node3 26264 1727204252.86228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204252.91145: done with get_vars() 26264 1727204252.91190: done getting variables 26264 1727204252.91261: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.194) 0:00:16.765 ***** 26264 1727204252.91637: entering _queue_task() for managed-node3/fail 26264 1727204252.92011: worker is 1 (out of 1 available) 26264 1727204252.92025: exiting _queue_task() for managed-node3/fail 26264 1727204252.92038: done queuing things up, now waiting for results queue to drain 26264 1727204252.92040: waiting for pending results... 
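The same `when:` conditional appears again on the `fail` task queued above. Ansible evaluates such expressions through Jinja2 (version 3.1.4 in this run, per the header) and skips the task when the result is false. A minimal illustration of that skip decision — this approximates the templating step with a restricted `eval` purely for demonstration, it is not Ansible's implementation:

```python
def should_skip(conditional: str, variables: dict) -> bool:
    # Ansible renders the `when:` expression with Jinja2 against host/role
    # variables; here we approximate that with eval over a locked-down
    # namespace. A falsy result means the task is skipped.
    return not eval(conditional, {"__builtins__": {}}, dict(variables))

# The values the log reports for this host: neither wireless nor team
# connections are defined, so both flags are false.
facts = {
    "__network_wireless_connections_defined": False,
    "__network_team_connections_defined": False,
}
cond = "__network_wireless_connections_defined or __network_team_connections_defined"
print(should_skip(cond, facts))  # True -> "skipping: [managed-node3]"
```

This matches the `"false_condition"` field in the skipped-task JSON: the raw expression is echoed back so you can see exactly which guard short-circuited the task.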
26264 1727204252.92423: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 26264 1727204252.92525: in run() - task 0affcd87-79f5-5ff5-08b0-000000000020 26264 1727204252.92538: variable 'ansible_search_path' from source: unknown 26264 1727204252.92542: variable 'ansible_search_path' from source: unknown 26264 1727204252.92584: calling self._execute() 26264 1727204252.92681: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204252.92684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204252.92697: variable 'omit' from source: magic vars 26264 1727204252.93094: variable 'ansible_distribution_major_version' from source: facts 26264 1727204252.93108: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204252.93232: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204252.93437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204252.97227: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204252.97308: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204252.97347: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204252.97385: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204252.97413: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204252.97502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 26264 1727204252.97532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204252.97563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.97607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204252.97622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204252.97671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204252.97699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204252.97723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.97768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204252.97807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204252.97826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204252.97849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204252.97884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.97916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204252.97930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204252.98130: variable 'network_connections' from source: play vars 26264 1727204252.98140: variable 'interface' from source: set_fact 26264 1727204252.98226: variable 'interface' from source: set_fact 26264 1727204252.98241: variable 'interface' from source: set_fact 26264 1727204252.98300: variable 'interface' from source: set_fact 26264 1727204252.98377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204252.98622: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204252.98661: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204252.98709: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204252.98736: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204252.98788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204252.98808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204252.98833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204252.98869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204252.98926: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204252.99205: variable 'network_connections' from source: play vars 26264 1727204252.99215: variable 'interface' from source: set_fact 26264 1727204252.99281: variable 'interface' from source: set_fact 26264 1727204252.99288: variable 'interface' from source: set_fact 26264 1727204252.99351: variable 'interface' from source: set_fact 26264 1727204252.99389: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26264 1727204252.99392: when evaluation is False, skipping this task 26264 1727204252.99395: _execute() done 26264 1727204252.99397: dumping result to json 26264 1727204252.99399: done dumping result, returning 26264 1727204252.99408: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-000000000020] 26264 1727204252.99420: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000020 26264 1727204252.99520: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000020 26264 1727204252.99523: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26264 1727204252.99590: no more pending results, returning what we have 26264 1727204252.99595: results queue empty 26264 1727204252.99596: checking for any_errors_fatal 26264 1727204252.99601: done checking for any_errors_fatal 26264 1727204252.99602: checking for max_fail_percentage 26264 1727204252.99604: done checking for max_fail_percentage 26264 1727204252.99605: checking to see if all hosts have failed and the running result is not ok 26264 1727204252.99606: done checking to see if all hosts have failed 26264 1727204252.99607: getting the remaining hosts for this loop 26264 1727204252.99609: done getting the remaining hosts for this loop 26264 1727204252.99614: getting the next task for host managed-node3 26264 1727204252.99620: done getting next task for host managed-node3 26264 1727204252.99625: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 26264 1727204252.99628: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204252.99642: getting variables 26264 1727204252.99644: in VariableManager get_vars() 26264 1727204252.99694: Calling all_inventory to load vars for managed-node3 26264 1727204252.99698: Calling groups_inventory to load vars for managed-node3 26264 1727204252.99701: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204252.99715: Calling all_plugins_play to load vars for managed-node3 26264 1727204252.99718: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204252.99722: Calling groups_plugins_play to load vars for managed-node3 26264 1727204253.01379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204253.09028: done with get_vars() 26264 1727204253.09082: done getting variables 26264 1727204253.09156: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.176) 0:00:16.942 ***** 26264 1727204253.09344: entering _queue_task() for managed-node3/package 26264 1727204253.10228: worker is 1 (out of 1 available) 26264 1727204253.10266: exiting _queue_task() for managed-node3/package 26264 1727204253.10279: done queuing things up, now waiting for results queue to drain 26264 1727204253.10281: waiting for pending results... 
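The `PID epoch:` prefix on every line of this transcript (e.g. `26264 1727204253.09344:`) comes from Ansible's debug mode rather than plain verbosity. An illustrative invocation that produces this style of output — the inventory and playbook paths are placeholders, not the `/tmp/network-M6W/` paths from this run:

```shell
# ANSIBLE_DEBUG=1 adds the "PID timestamp:" trace lines; -vvvv raises
# connection and task-level verbosity on top of that.
ANSIBLE_DEBUG=1 ansible-playbook -vvvv -i inventory.yml playbook.yml
```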
26264 1727204253.11365: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 26264 1727204253.11626: in run() - task 0affcd87-79f5-5ff5-08b0-000000000021 26264 1727204253.11637: variable 'ansible_search_path' from source: unknown 26264 1727204253.11640: variable 'ansible_search_path' from source: unknown 26264 1727204253.11735: calling self._execute() 26264 1727204253.11839: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204253.11843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204253.11857: variable 'omit' from source: magic vars 26264 1727204253.12290: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.12305: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204253.12511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204253.12792: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204253.12833: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204253.12924: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204253.12963: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204253.13116: variable 'network_packages' from source: role '' defaults 26264 1727204253.13231: variable '__network_provider_setup' from source: role '' defaults 26264 1727204253.13242: variable '__network_service_name_default_nm' from source: role '' defaults 26264 1727204253.13314: variable '__network_service_name_default_nm' from source: role '' defaults 26264 1727204253.13329: variable '__network_packages_default_nm' from source: role '' defaults 26264 1727204253.13392: variable 
'__network_packages_default_nm' from source: role '' defaults 26264 1727204253.13578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204253.15728: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204253.15801: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204253.15859: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204253.15899: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204253.15933: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204253.16016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.16057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.16335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.16466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.16580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 
1727204253.16845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.16876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.17023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.17189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.17211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.17810: variable '__network_packages_default_gobject_packages' from source: role '' defaults 26264 1727204253.18153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.18357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.18391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.18738: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.18798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.19281: variable 'ansible_python' from source: facts 26264 1727204253.19313: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 26264 1727204253.19443: variable '__network_wpa_supplicant_required' from source: role '' defaults 26264 1727204253.19536: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26264 1727204253.20036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.20136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.20175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.20219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.20236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.20293: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.20328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.20358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.20699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.20724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.21100: variable 'network_connections' from source: play vars 26264 1727204253.21112: variable 'interface' from source: set_fact 26264 1727204253.21259: variable 'interface' from source: set_fact 26264 1727204253.21282: variable 'interface' from source: set_fact 26264 1727204253.21418: variable 'interface' from source: set_fact 26264 1727204253.21503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204253.21535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204253.21573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.21612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204253.21666: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204253.21970: variable 'network_connections' from source: play vars 26264 1727204253.21980: variable 'interface' from source: set_fact 26264 1727204253.22087: variable 'interface' from source: set_fact 26264 1727204253.22102: variable 'interface' from source: set_fact 26264 1727204253.22242: variable 'interface' from source: set_fact 26264 1727204253.22308: variable '__network_packages_default_wireless' from source: role '' defaults 26264 1727204253.22398: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204253.22717: variable 'network_connections' from source: play vars 26264 1727204253.22726: variable 'interface' from source: set_fact 26264 1727204253.22786: variable 'interface' from source: set_fact 26264 1727204253.22800: variable 'interface' from source: set_fact 26264 1727204253.22974: variable 'interface' from source: set_fact 26264 1727204253.23043: variable '__network_packages_default_team' from source: role '' defaults 26264 1727204253.23291: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204253.23814: variable 'network_connections' from source: play vars 26264 1727204253.23944: variable 'interface' from source: set_fact 26264 1727204253.24014: variable 'interface' from source: set_fact 26264 1727204253.24031: variable 'interface' from source: set_fact 26264 1727204253.24103: variable 'interface' from source: set_fact 26264 1727204253.24182: variable '__network_service_name_default_initscripts' from source: role '' defaults 26264 
1727204253.24252: variable '__network_service_name_default_initscripts' from source: role '' defaults 26264 1727204253.24267: variable '__network_packages_default_initscripts' from source: role '' defaults 26264 1727204253.24329: variable '__network_packages_default_initscripts' from source: role '' defaults 26264 1727204253.24622: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 26264 1727204253.25131: variable 'network_connections' from source: play vars 26264 1727204253.25144: variable 'interface' from source: set_fact 26264 1727204253.25245: variable 'interface' from source: set_fact 26264 1727204253.25266: variable 'interface' from source: set_fact 26264 1727204253.25367: variable 'interface' from source: set_fact 26264 1727204253.25445: variable 'ansible_distribution' from source: facts 26264 1727204253.25492: variable '__network_rh_distros' from source: role '' defaults 26264 1727204253.25505: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.25599: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 26264 1727204253.26053: variable 'ansible_distribution' from source: facts 26264 1727204253.26063: variable '__network_rh_distros' from source: role '' defaults 26264 1727204253.26076: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.26094: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 26264 1727204253.26265: variable 'ansible_distribution' from source: facts 26264 1727204253.26275: variable '__network_rh_distros' from source: role '' defaults 26264 1727204253.26283: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.26354: variable 'network_provider' from source: set_fact 26264 1727204253.26376: variable 'ansible_facts' from source: unknown 26264 1727204253.27733: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 26264 1727204253.27743: when evaluation is False, skipping this task 26264 1727204253.27754: _execute() done 26264 1727204253.27762: dumping result to json 26264 1727204253.27773: done dumping result, returning 26264 1727204253.27877: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-5ff5-08b0-000000000021] 26264 1727204253.27906: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000021 skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 26264 1727204253.28187: no more pending results, returning what we have 26264 1727204253.28192: results queue empty 26264 1727204253.28193: checking for any_errors_fatal 26264 1727204253.28199: done checking for any_errors_fatal 26264 1727204253.28200: checking for max_fail_percentage 26264 1727204253.28202: done checking for max_fail_percentage 26264 1727204253.28203: checking to see if all hosts have failed and the running result is not ok 26264 1727204253.28204: done checking to see if all hosts have failed 26264 1727204253.28204: getting the remaining hosts for this loop 26264 1727204253.28206: done getting the remaining hosts for this loop 26264 1727204253.28210: getting the next task for host managed-node3 26264 1727204253.28217: done getting next task for host managed-node3 26264 1727204253.28221: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 26264 1727204253.28223: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204253.28271: getting variables 26264 1727204253.28278: in VariableManager get_vars() 26264 1727204253.28440: Calling all_inventory to load vars for managed-node3 26264 1727204253.28444: Calling groups_inventory to load vars for managed-node3 26264 1727204253.28450: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204253.28462: Calling all_plugins_play to load vars for managed-node3 26264 1727204253.28534: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204253.28572: Calling groups_plugins_play to load vars for managed-node3 26264 1727204253.30489: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000021 26264 1727204253.30493: WORKER PROCESS EXITING 26264 1727204253.30734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204253.32501: done with get_vars() 26264 1727204253.32525: done getting variables 26264 1727204253.32586: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.233) 0:00:17.175 ***** 26264 1727204253.32619: entering _queue_task() for managed-node3/package 26264 1727204253.34467: worker is 1 (out of 1 available) 26264 1727204253.34480: exiting _queue_task() for managed-node3/package 26264 1727204253.34494: done queuing things up, now waiting for results queue to drain 26264 1727204253.34496: waiting for pending results... 
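The "Install packages" skip recorded above can be read back into role YAML. The sketch below is a hedged reconstruction from the log only: the task name and path (`roles/network/tasks/main.yml:73`) and both `when:` expressions are taken verbatim from the logged `Evaluated conditional` and `false_condition` lines, while the `package:` arguments (installing `network_packages` with `state: present`) are assumptions about what the real task does:

```yaml
# Hedged sketch reconstructed from the log; not the verbatim role source.
# Task path in the log: .../roles/network/tasks/main.yml:73
- name: Install packages
  package:
    name: "{{ network_packages }}"   # assumed argument; the role computes network_packages from defaults
    state: present                   # assumed
  when:
    # Both expressions appear verbatim in the log; the second evaluated False,
    # so the task was skipped (all listed packages were already installed).
    - ansible_distribution_major_version != '6'
    - not network_packages is subset(ansible_facts.packages.keys())
```

The `subset` Jinja test explains the skip: when every package in `network_packages` is already a key in `ansible_facts.packages`, the condition is False and no install runs.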
26264 1727204253.35435: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 26264 1727204253.35712: in run() - task 0affcd87-79f5-5ff5-08b0-000000000022 26264 1727204253.35733: variable 'ansible_search_path' from source: unknown 26264 1727204253.35742: variable 'ansible_search_path' from source: unknown 26264 1727204253.35788: calling self._execute() 26264 1727204253.35889: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204253.35903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204253.35987: variable 'omit' from source: magic vars 26264 1727204253.36486: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.36866: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204253.37065: variable 'network_state' from source: role '' defaults 26264 1727204253.37146: Evaluated conditional (network_state != {}): False 26264 1727204253.37158: when evaluation is False, skipping this task 26264 1727204253.37168: _execute() done 26264 1727204253.37176: dumping result to json 26264 1727204253.37184: done dumping result, returning 26264 1727204253.37200: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-5ff5-08b0-000000000022] 26264 1727204253.37217: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000022 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204253.37392: no more pending results, returning what we have 26264 1727204253.37396: results queue empty 26264 1727204253.37398: checking for any_errors_fatal 26264 1727204253.37405: done checking for any_errors_fatal 26264 1727204253.37406: checking for max_fail_percentage 26264 
1727204253.37408: done checking for max_fail_percentage 26264 1727204253.37409: checking to see if all hosts have failed and the running result is not ok 26264 1727204253.37410: done checking to see if all hosts have failed 26264 1727204253.37411: getting the remaining hosts for this loop 26264 1727204253.37413: done getting the remaining hosts for this loop 26264 1727204253.37417: getting the next task for host managed-node3 26264 1727204253.37425: done getting next task for host managed-node3 26264 1727204253.37429: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 26264 1727204253.37431: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204253.37452: getting variables 26264 1727204253.37454: in VariableManager get_vars() 26264 1727204253.37497: Calling all_inventory to load vars for managed-node3 26264 1727204253.37500: Calling groups_inventory to load vars for managed-node3 26264 1727204253.37503: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204253.37516: Calling all_plugins_play to load vars for managed-node3 26264 1727204253.37520: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204253.37523: Calling groups_plugins_play to load vars for managed-node3 26264 1727204253.38718: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000022 26264 1727204253.38721: WORKER PROCESS EXITING 26264 1727204253.40445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204253.43419: done with get_vars() 26264 1727204253.43543: done getting variables 26264 1727204253.44139: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.115) 0:00:17.290 ***** 26264 1727204253.44184: entering _queue_task() for managed-node3/package 26264 1727204253.44544: worker is 1 (out of 1 available) 26264 1727204253.44569: exiting _queue_task() for managed-node3/package 26264 1727204253.44582: done queuing things up, now waiting for results queue to drain 26264 1727204253.44584: waiting for pending results... 26264 1727204253.45277: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 26264 1727204253.45727: in run() - task 0affcd87-79f5-5ff5-08b0-000000000023 26264 1727204253.45779: variable 'ansible_search_path' from source: unknown 26264 1727204253.45811: variable 'ansible_search_path' from source: unknown 26264 1727204253.45868: calling self._execute() 26264 1727204253.46004: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204253.46014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204253.46032: variable 'omit' from source: magic vars 26264 1727204253.46970: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.47116: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204253.50366: variable 'network_state' from source: role '' defaults 26264 1727204253.50506: Evaluated conditional (network_state != {}): False 26264 1727204253.50536: when evaluation is False, 
skipping this task 26264 1727204253.50602: _execute() done 26264 1727204253.50629: dumping result to json 26264 1727204253.50652: done dumping result, returning 26264 1727204253.50701: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-5ff5-08b0-000000000023] 26264 1727204253.50714: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000023 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204253.51032: no more pending results, returning what we have 26264 1727204253.51039: results queue empty 26264 1727204253.51042: checking for any_errors_fatal 26264 1727204253.51081: done checking for any_errors_fatal 26264 1727204253.51082: checking for max_fail_percentage 26264 1727204253.51084: done checking for max_fail_percentage 26264 1727204253.51085: checking to see if all hosts have failed and the running result is not ok 26264 1727204253.51087: done checking to see if all hosts have failed 26264 1727204253.51087: getting the remaining hosts for this loop 26264 1727204253.51090: done getting the remaining hosts for this loop 26264 1727204253.51108: getting the next task for host managed-node3 26264 1727204253.51116: done getting next task for host managed-node3 26264 1727204253.51120: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 26264 1727204253.51142: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204253.51162: getting variables 26264 1727204253.51165: in VariableManager get_vars() 26264 1727204253.51210: Calling all_inventory to load vars for managed-node3 26264 1727204253.51214: Calling groups_inventory to load vars for managed-node3 26264 1727204253.51216: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204253.51230: Calling all_plugins_play to load vars for managed-node3 26264 1727204253.51234: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204253.51237: Calling groups_plugins_play to load vars for managed-node3 26264 1727204253.52437: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000023 26264 1727204253.52441: WORKER PROCESS EXITING 26264 1727204253.57538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204253.59629: done with get_vars() 26264 1727204253.59672: done getting variables 26264 1727204253.59795: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.156) 0:00:17.447 ***** 26264 1727204253.59827: entering _queue_task() for managed-node3/service 26264 1727204253.59829: Creating lock for service 26264 1727204253.60204: worker is 1 (out of 1 available) 26264 1727204253.60216: exiting _queue_task() for managed-node3/service 26264 1727204253.60230: done queuing things up, now waiting for results queue to drain 26264 1727204253.60232: waiting for pending results... 
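The two tasks skipped just above (`main.yml:85` and `main.yml:96`) are both gated on the same condition, `network_state != {}`, which the log shows evaluating to False because no `network_state` variable was supplied. A hedged sketch, with task names and `when:` expressions taken from the log and package lists assumed from the task titles:

```yaml
# Hedged sketch; package names are inferred from the task titles, not from role source.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager   # assumed from the task title
      - nmstate          # assumed from the task title
    state: present
  when: network_state != {}   # verbatim false_condition from the log

- name: Install python3-libnmstate when using network_state variable
  package:
    name: python3-libnmstate   # assumed from the task title
    state: present
  when: network_state != {}
```

Since this run configures connections via `network_connections` rather than the declarative `network_state` interface, both conditions are False and the nmstate tooling is never installed.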
26264 1727204253.60538: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 26264 1727204253.60675: in run() - task 0affcd87-79f5-5ff5-08b0-000000000024 26264 1727204253.60697: variable 'ansible_search_path' from source: unknown 26264 1727204253.60704: variable 'ansible_search_path' from source: unknown 26264 1727204253.60756: calling self._execute() 26264 1727204253.60870: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204253.60882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204253.60901: variable 'omit' from source: magic vars 26264 1727204253.61326: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.61342: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204253.61476: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204253.61698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204253.65457: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204253.65538: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204253.65598: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204253.65642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204253.65677: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204253.65780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 26264 1727204253.65817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.65860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.65916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.65937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.66005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.66036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.66080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.66129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.66148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.66204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.66235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.66270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.66324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.66345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.66550: variable 'network_connections' from source: play vars 26264 1727204253.66570: variable 'interface' from source: set_fact 26264 1727204253.66661: variable 'interface' from source: set_fact 26264 1727204253.66678: variable 'interface' from source: set_fact 26264 1727204253.66766: variable 'interface' from source: set_fact 26264 1727204253.66861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204253.67056: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204253.67101: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204253.67139: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204253.67195: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204253.67245: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204253.67283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204253.67316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.67344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204253.67425: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204253.67711: variable 'network_connections' from source: play vars 26264 1727204253.67726: variable 'interface' from source: set_fact 26264 1727204253.67794: variable 'interface' from source: set_fact 26264 1727204253.67810: variable 'interface' from source: set_fact 26264 1727204253.67878: variable 'interface' from source: set_fact 26264 1727204253.67921: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26264 1727204253.67931: when evaluation is False, skipping this task 26264 1727204253.67938: _execute() done 26264 1727204253.67950: dumping result to json 26264 1727204253.67959: done dumping result, returning 26264 1727204253.67972: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-000000000024] 26264 1727204253.67993: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000024 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26264 1727204253.68169: no more pending results, returning what we have 26264 1727204253.68174: results queue empty 26264 1727204253.68176: checking for any_errors_fatal 26264 1727204253.68187: done checking for any_errors_fatal 26264 1727204253.68188: checking for max_fail_percentage 26264 1727204253.68190: done checking for max_fail_percentage 26264 1727204253.68191: checking to see if all hosts have failed and the running result is not ok 26264 1727204253.68192: done checking to see if all hosts have failed 26264 1727204253.68193: getting the remaining hosts for this loop 26264 1727204253.68195: done getting the remaining hosts for this loop 26264 1727204253.68200: getting the next task for host managed-node3 26264 1727204253.68211: done getting next task for host managed-node3 26264 1727204253.68215: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 26264 1727204253.68217: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204253.68233: getting variables 26264 1727204253.68235: in VariableManager get_vars() 26264 1727204253.68279: Calling all_inventory to load vars for managed-node3 26264 1727204253.68283: Calling groups_inventory to load vars for managed-node3 26264 1727204253.68286: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204253.68298: Calling all_plugins_play to load vars for managed-node3 26264 1727204253.68301: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204253.68304: Calling groups_plugins_play to load vars for managed-node3 26264 1727204253.69472: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000024 26264 1727204253.69476: WORKER PROCESS EXITING 26264 1727204253.70645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204253.72501: done with get_vars() 26264 1727204253.72531: done getting variables 26264 1727204253.72605: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.128) 0:00:17.575 ***** 26264 1727204253.72635: entering _queue_task() for managed-node3/service 26264 1727204253.73224: worker is 1 (out of 1 available) 26264 1727204253.73238: exiting _queue_task() for managed-node3/service 26264 1727204253.73253: done queuing things up, now waiting for results queue to drain 26264 1727204253.73255: waiting for pending results... 
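The skip record above (`skipping: [managed-node3] => {...}` with `"skip_reason": "Conditional result was False"`) is produced when the task's `when` expression evaluates to False against the gathered variables. A minimal sketch of that decision follows; note this is an illustration only, using plain `eval` in place of Ansible's actual Jinja2 templating, and the helper names are invented for the example:

```python
# Illustrative sketch -- NOT Ansible's real code path. Ansible renders
# `when` expressions through Jinja2 against the merged task vars; here we
# approximate that with eval() over a plain dict, which is enough for the
# simple boolean expression seen in the log.

def evaluate_when(expression: str, task_vars: dict) -> bool:
    # eval() with empty builtins, resolving names from task_vars.
    return bool(eval(expression, {"__builtins__": {}}, dict(task_vars)))

def run_or_skip(expression: str, task_vars: dict) -> dict:
    if not evaluate_when(expression, task_vars):
        # Mirrors the skip record printed in the log above.
        return {
            "changed": False,
            "false_condition": expression,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}  # placeholder for actually running the module

expression = ("__network_wireless_connections_defined"
              " or __network_team_connections_defined")
task_vars = {
    "__network_wireless_connections_defined": False,
    "__network_team_connections_defined": False,
}
result = run_or_skip(expression, task_vars)
```

In the real run, the operands come from the variable sources the log enumerates just before the evaluation (role defaults, play vars, `set_fact`), and a False result short-circuits module execution entirely, which is why no connection is opened for the skipped task.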
26264 1727204253.74035: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 26264 1727204253.74142: in run() - task 0affcd87-79f5-5ff5-08b0-000000000025 26264 1727204253.74156: variable 'ansible_search_path' from source: unknown 26264 1727204253.74160: variable 'ansible_search_path' from source: unknown 26264 1727204253.74205: calling self._execute() 26264 1727204253.74294: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204253.74304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204253.74315: variable 'omit' from source: magic vars 26264 1727204253.74708: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.74719: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204253.74929: variable 'network_provider' from source: set_fact 26264 1727204253.74933: variable 'network_state' from source: role '' defaults 26264 1727204253.74955: Evaluated conditional (network_provider == "nm" or network_state != {}): True 26264 1727204253.74965: variable 'omit' from source: magic vars 26264 1727204253.75013: variable 'omit' from source: magic vars 26264 1727204253.75044: variable 'network_service_name' from source: role '' defaults 26264 1727204253.75140: variable 'network_service_name' from source: role '' defaults 26264 1727204253.75254: variable '__network_provider_setup' from source: role '' defaults 26264 1727204253.75258: variable '__network_service_name_default_nm' from source: role '' defaults 26264 1727204253.75326: variable '__network_service_name_default_nm' from source: role '' defaults 26264 1727204253.75334: variable '__network_packages_default_nm' from source: role '' defaults 26264 1727204253.75400: variable '__network_packages_default_nm' from source: role '' defaults 26264 1727204253.75639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 26264 1727204253.78044: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204253.78130: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204253.78167: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204253.78202: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204253.78234: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204253.78312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.78350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.78377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.78435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.78453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.78497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 26264 1727204253.78536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.78561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.78611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.78625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.78886: variable '__network_packages_default_gobject_packages' from source: role '' defaults 26264 1727204253.79031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.79062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.79101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.79146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.79170: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.79266: variable 'ansible_python' from source: facts 26264 1727204253.79289: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 26264 1727204253.79501: variable '__network_wpa_supplicant_required' from source: role '' defaults 26264 1727204253.79868: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26264 1727204253.79872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.79875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.79877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.79879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.79881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.79883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204253.79897: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204253.79920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.79966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204253.79981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204253.80123: variable 'network_connections' from source: play vars 26264 1727204253.80130: variable 'interface' from source: set_fact 26264 1727204253.80212: variable 'interface' from source: set_fact 26264 1727204253.80223: variable 'interface' from source: set_fact 26264 1727204253.80290: variable 'interface' from source: set_fact 26264 1727204253.80402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204253.80599: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204253.80654: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204253.80696: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204253.80742: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204253.80797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204253.80828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204253.80858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204253.80890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204253.80941: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204253.81244: variable 'network_connections' from source: play vars 26264 1727204253.81253: variable 'interface' from source: set_fact 26264 1727204253.81356: variable 'interface' from source: set_fact 26264 1727204253.81373: variable 'interface' from source: set_fact 26264 1727204253.81461: variable 'interface' from source: set_fact 26264 1727204253.81531: variable '__network_packages_default_wireless' from source: role '' defaults 26264 1727204253.81635: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204253.81951: variable 'network_connections' from source: play vars 26264 1727204253.81954: variable 'interface' from source: set_fact 26264 1727204253.82030: variable 'interface' from source: set_fact 26264 1727204253.82037: variable 'interface' from source: set_fact 26264 1727204253.82105: variable 'interface' from source: set_fact 26264 1727204253.82132: variable '__network_packages_default_team' from source: role '' defaults 26264 1727204253.82223: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204253.82530: variable 
'network_connections' from source: play vars 26264 1727204253.82535: variable 'interface' from source: set_fact 26264 1727204253.82613: variable 'interface' from source: set_fact 26264 1727204253.82620: variable 'interface' from source: set_fact 26264 1727204253.82695: variable 'interface' from source: set_fact 26264 1727204253.82755: variable '__network_service_name_default_initscripts' from source: role '' defaults 26264 1727204253.82823: variable '__network_service_name_default_initscripts' from source: role '' defaults 26264 1727204253.82831: variable '__network_packages_default_initscripts' from source: role '' defaults 26264 1727204253.82892: variable '__network_packages_default_initscripts' from source: role '' defaults 26264 1727204253.83123: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 26264 1727204253.83680: variable 'network_connections' from source: play vars 26264 1727204253.83683: variable 'interface' from source: set_fact 26264 1727204253.83743: variable 'interface' from source: set_fact 26264 1727204253.83746: variable 'interface' from source: set_fact 26264 1727204253.83813: variable 'interface' from source: set_fact 26264 1727204253.83822: variable 'ansible_distribution' from source: facts 26264 1727204253.83825: variable '__network_rh_distros' from source: role '' defaults 26264 1727204253.83830: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.83853: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 26264 1727204253.84036: variable 'ansible_distribution' from source: facts 26264 1727204253.84039: variable '__network_rh_distros' from source: role '' defaults 26264 1727204253.84042: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.84056: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 26264 1727204253.84243: variable 'ansible_distribution' from source: 
facts 26264 1727204253.84246: variable '__network_rh_distros' from source: role '' defaults 26264 1727204253.84253: variable 'ansible_distribution_major_version' from source: facts 26264 1727204253.84291: variable 'network_provider' from source: set_fact 26264 1727204253.84321: variable 'omit' from source: magic vars 26264 1727204253.84350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204253.84379: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204253.84397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204253.84418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204253.84432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204253.84461: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204253.84465: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204253.84468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204253.84567: Set connection var ansible_pipelining to False 26264 1727204253.84571: Set connection var ansible_connection to ssh 26264 1727204253.84573: Set connection var ansible_shell_type to sh 26264 1727204253.84579: Set connection var ansible_shell_executable to /bin/sh 26264 1727204253.84587: Set connection var ansible_timeout to 10 26264 1727204253.84593: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204253.84618: variable 'ansible_shell_executable' from source: unknown 26264 1727204253.84621: variable 'ansible_connection' from source: unknown 26264 1727204253.84624: variable 'ansible_module_compression' from source: unknown 26264 1727204253.84630: 
variable 'ansible_shell_type' from source: unknown 26264 1727204253.84632: variable 'ansible_shell_executable' from source: unknown 26264 1727204253.84634: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204253.84643: variable 'ansible_pipelining' from source: unknown 26264 1727204253.84646: variable 'ansible_timeout' from source: unknown 26264 1727204253.84652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204253.84765: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204253.84775: variable 'omit' from source: magic vars 26264 1727204253.84781: starting attempt loop 26264 1727204253.84784: running the handler 26264 1727204253.84868: variable 'ansible_facts' from source: unknown 26264 1727204253.85769: _low_level_execute_command(): starting 26264 1727204253.85777: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204253.86714: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204253.86725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204253.86736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204253.86752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204253.86793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204253.86800: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204253.86817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204253.86831: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204253.86838: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204253.86845: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204253.86853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204253.86862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204253.86875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204253.86882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204253.86889: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204253.86898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204253.86976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204253.86994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204253.87007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204253.87081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204253.88707: stdout chunk (state=3): >>>/root <<< 26264 1727204253.88889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204253.88893: stdout chunk (state=3): >>><<< 26264 1727204253.88904: stderr chunk (state=3): >>><<< 26264 1727204253.88923: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204253.88935: _low_level_execute_command(): starting 26264 1727204253.88941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885 `" && echo ansible-tmp-1727204253.8892336-27593-19234131038885="` echo /root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885 `" ) && sleep 0' 26264 1727204253.89573: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204253.89583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204253.89596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204253.89610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204253.89653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204253.90085: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204253.90095: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204253.90108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204253.90114: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204253.90121: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204253.90128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204253.90136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204253.90154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204253.90157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204253.90160: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204253.90170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204253.90240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204253.90268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204253.90271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204253.90336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204253.92279: stdout chunk (state=3): >>>ansible-tmp-1727204253.8892336-27593-19234131038885=/root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885 <<< 26264 1727204253.92333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204253.92336: stdout chunk (state=3): >>><<< 26264 1727204253.92343: stderr chunk (state=3): >>><<< 26264 1727204253.92361: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204253.8892336-27593-19234131038885=/root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204253.92395: variable 'ansible_module_compression' from source: unknown 26264 1727204253.92458: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 26264 1727204253.92463: ANSIBALLZ: Acquiring lock 26264 1727204253.92469: ANSIBALLZ: Lock acquired: 139841028923536 26264 1727204253.92471: ANSIBALLZ: Creating module 26264 1727204254.39272: ANSIBALLZ: Writing module into payload 26264 1727204254.39493: ANSIBALLZ: Writing module 26264 1727204254.40226: ANSIBALLZ: Renaming module 26264 1727204254.40239: ANSIBALLZ: Done creating module 26264 1727204254.40284: variable 'ansible_facts' from source: unknown 26264 1727204254.40656: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885/AnsiballZ_systemd.py 26264 1727204254.40829: Sending initial data 26264 1727204254.40833: Sent initial data (155 bytes) 26264 1727204254.41803: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204254.41826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204254.41843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204254.41866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204254.41913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204254.41926: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204254.41945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204254.41966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204254.41980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204254.41991: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204254.42004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204254.42018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204254.42038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204254.42052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204254.42066: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204254.42081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 
1727204254.42159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204254.42187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204254.42205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204254.42291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204254.44095: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204254.44125: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204254.44188: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpxupzkkdf /root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885/AnsiballZ_systemd.py <<< 26264 1727204254.44202: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204254.47184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204254.47389: stderr chunk (state=3): >>><<< 26264 1727204254.47393: stdout chunk (state=3): >>><<< 26264 1727204254.47395: done transferring module to remote 26264 1727204254.47397: _low_level_execute_command(): starting 26264 1727204254.47400: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885/ /root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885/AnsiballZ_systemd.py && sleep 0' 26264 1727204254.47941: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204254.47945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204254.47989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204254.47992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204254.47995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204254.48072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204254.48078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204254.48081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204254.48127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204254.49824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204254.49897: stderr chunk (state=3): >>><<< 26264 1727204254.49900: stdout chunk (state=3): >>><<< 26264 1727204254.49919: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204254.49922: _low_level_execute_command(): starting 26264 1727204254.49927: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885/AnsiballZ_systemd.py && sleep 0' 26264 1727204254.51860: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204254.51870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204254.51883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204254.51900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204254.51940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 <<< 26264 1727204254.51948: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204254.51961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204254.51978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204254.51986: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204254.51994: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204254.52005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204254.52014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204254.52027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204254.52033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204254.52039: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204254.52048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204254.52127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204254.52146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204254.52155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204254.52334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204254.77199: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", 
"TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 26264 1727204254.77205: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "16179200", "MemoryAvailable": "infinity", "CPUUsageNSec": "1501449000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", 
"IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", 
"CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogS<<< 26264 1727204254.77214: stdout chunk (state=3): >>>ignal": "6", "Id": 
"NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": 
"12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 26264 1727204254.78771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204254.78775: stdout chunk (state=3): >>><<< 26264 1727204254.78779: stderr chunk (state=3): >>><<< 26264 1727204254.78803: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "16179200", "MemoryAvailable": "infinity", "CPUUsageNSec": "1501449000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", 
"ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", 
"StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204254.78978: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204254.79005: _low_level_execute_command(): starting 26264 1727204254.79010: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204253.8892336-27593-19234131038885/ > /dev/null 2>&1 && sleep 0' 26264 1727204254.80657: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204254.80668: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 26264 1727204254.80682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204254.80692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204254.80737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204254.80745: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204254.80755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204254.80771: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204254.80779: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204254.80786: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204254.80793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204254.80802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204254.80815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204254.80825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204254.80833: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204254.80843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204254.80915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204254.80927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204254.80944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204254.81037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 26264 1727204254.82885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204254.82889: stdout chunk (state=3): >>><<< 26264 1727204254.82894: stderr chunk (state=3): >>><<< 26264 1727204254.82912: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204254.82920: handler run complete 26264 1727204254.83025: attempt loop complete, returning result 26264 1727204254.83029: _execute() done 26264 1727204254.83031: dumping result to json 26264 1727204254.83033: done dumping result, returning 26264 1727204254.83035: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-5ff5-08b0-000000000025] 26264 1727204254.83037: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000025 26264 1727204254.83413: done sending task 
result for task 0affcd87-79f5-5ff5-08b0-000000000025 26264 1727204254.83417: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204254.83466: no more pending results, returning what we have 26264 1727204254.83469: results queue empty 26264 1727204254.83469: checking for any_errors_fatal 26264 1727204254.83474: done checking for any_errors_fatal 26264 1727204254.83475: checking for max_fail_percentage 26264 1727204254.83476: done checking for max_fail_percentage 26264 1727204254.83477: checking to see if all hosts have failed and the running result is not ok 26264 1727204254.83478: done checking to see if all hosts have failed 26264 1727204254.83478: getting the remaining hosts for this loop 26264 1727204254.83480: done getting the remaining hosts for this loop 26264 1727204254.83483: getting the next task for host managed-node3 26264 1727204254.83488: done getting next task for host managed-node3 26264 1727204254.83492: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 26264 1727204254.83493: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204254.83501: getting variables 26264 1727204254.83503: in VariableManager get_vars() 26264 1727204254.83535: Calling all_inventory to load vars for managed-node3 26264 1727204254.83538: Calling groups_inventory to load vars for managed-node3 26264 1727204254.83540: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204254.83551: Calling all_plugins_play to load vars for managed-node3 26264 1727204254.83553: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204254.83556: Calling groups_plugins_play to load vars for managed-node3 26264 1727204254.85225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204254.87685: done with get_vars() 26264 1727204254.87713: done getting variables 26264 1727204254.87795: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:34 -0400 (0:00:01.151) 0:00:18.727 ***** 26264 1727204254.87828: entering _queue_task() for managed-node3/service 26264 1727204254.88179: worker is 1 (out of 1 available) 26264 1727204254.88192: exiting _queue_task() for managed-node3/service 26264 1727204254.88210: done queuing things up, now waiting for results queue to drain 26264 1727204254.88212: waiting for pending results... 
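For reference, the "Enable and start NetworkManager" task whose execution is traced above can be reconstructed from the `module_args` and the censored (`no_log`) result echoed in the log. This is a hypothetical sketch, not the role's actual source — the real task lives in the collection at `roles/network/tasks/main.yml`:

```yaml
# Hypothetical reconstruction from the module_args in this log
# (name=NetworkManager, state=started, enabled=true; output censored,
# which implies no_log was set on the task or its enclosing block).
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true
```

The `"changed": false` in the result indicates the service was already running and enabled, so the module reported ok without modifying the unit.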
26264 1727204254.88558: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 26264 1727204254.88731: in run() - task 0affcd87-79f5-5ff5-08b0-000000000026 26264 1727204254.88753: variable 'ansible_search_path' from source: unknown 26264 1727204254.88766: variable 'ansible_search_path' from source: unknown 26264 1727204254.88820: calling self._execute() 26264 1727204254.88934: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204254.88945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204254.88966: variable 'omit' from source: magic vars 26264 1727204254.89419: variable 'ansible_distribution_major_version' from source: facts 26264 1727204254.89438: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204254.89585: variable 'network_provider' from source: set_fact 26264 1727204254.89619: Evaluated conditional (network_provider == "nm"): True 26264 1727204254.89737: variable '__network_wpa_supplicant_required' from source: role '' defaults 26264 1727204254.89860: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26264 1727204254.90060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204254.95125: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204254.95398: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204254.95496: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204254.95632: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204254.95674: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204254.95811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204254.95862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204254.95899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204254.95961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204254.95985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204254.96040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204254.96087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204254.96117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204254.96176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204254.96243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204254.96306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204254.96335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204254.96371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204254.96428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204254.96452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204254.96635: variable 'network_connections' from source: play vars 26264 1727204254.96655: variable 'interface' from source: set_fact 26264 1727204254.96832: variable 'interface' from source: set_fact 26264 1727204254.96854: variable 'interface' from source: set_fact 26264 1727204254.97001: variable 'interface' from source: set_fact 26264 1727204254.97225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204254.97676: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204254.97867: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204254.98077: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204254.98156: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204254.98210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204254.98254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204254.98320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204254.98480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204254.98530: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204254.99189: variable 'network_connections' from source: play vars 26264 1727204254.99231: variable 'interface' from source: set_fact 26264 1727204254.99652: variable 'interface' from source: set_fact 26264 1727204254.99668: variable 'interface' from source: set_fact 26264 1727204254.99822: variable 'interface' from source: set_fact 26264 1727204254.99974: Evaluated conditional (__network_wpa_supplicant_required): False 26264 1727204255.00028: when evaluation is False, skipping this task 26264 1727204255.00075: _execute() done 26264 1727204255.00094: dumping result 
to json 26264 1727204255.00101: done dumping result, returning 26264 1727204255.00116: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-5ff5-08b0-000000000026] 26264 1727204255.00183: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000026 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 26264 1727204255.00337: no more pending results, returning what we have 26264 1727204255.00341: results queue empty 26264 1727204255.00342: checking for any_errors_fatal 26264 1727204255.00371: done checking for any_errors_fatal 26264 1727204255.00373: checking for max_fail_percentage 26264 1727204255.00376: done checking for max_fail_percentage 26264 1727204255.00377: checking to see if all hosts have failed and the running result is not ok 26264 1727204255.00378: done checking to see if all hosts have failed 26264 1727204255.00378: getting the remaining hosts for this loop 26264 1727204255.00380: done getting the remaining hosts for this loop 26264 1727204255.00385: getting the next task for host managed-node3 26264 1727204255.00391: done getting next task for host managed-node3 26264 1727204255.00395: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 26264 1727204255.00397: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204255.00411: getting variables 26264 1727204255.00413: in VariableManager get_vars() 26264 1727204255.00454: Calling all_inventory to load vars for managed-node3 26264 1727204255.00457: Calling groups_inventory to load vars for managed-node3 26264 1727204255.00460: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204255.00481: Calling all_plugins_play to load vars for managed-node3 26264 1727204255.00485: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204255.00488: Calling groups_plugins_play to load vars for managed-node3 26264 1727204255.01834: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000026 26264 1727204255.01838: WORKER PROCESS EXITING 26264 1727204255.05375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204255.11685: done with get_vars() 26264 1727204255.11721: done getting variables 26264 1727204255.12017: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:35 -0400 (0:00:00.242) 0:00:18.969 ***** 26264 1727204255.12052: entering _queue_task() for managed-node3/service 26264 1727204255.13046: worker is 1 (out of 1 available) 26264 1727204255.13062: exiting _queue_task() for managed-node3/service 26264 1727204255.13077: done queuing things up, now waiting for results queue to drain 26264 1727204255.13079: waiting for pending results... 
26264 1727204255.14459: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 26264 1727204255.14845: in run() - task 0affcd87-79f5-5ff5-08b0-000000000027 26264 1727204255.14962: variable 'ansible_search_path' from source: unknown 26264 1727204255.14974: variable 'ansible_search_path' from source: unknown 26264 1727204255.15016: calling self._execute() 26264 1727204255.15261: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204255.15391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204255.15503: variable 'omit' from source: magic vars 26264 1727204255.16808: variable 'ansible_distribution_major_version' from source: facts 26264 1727204255.16828: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204255.17186: variable 'network_provider' from source: set_fact 26264 1727204255.17236: Evaluated conditional (network_provider == "initscripts"): False 26264 1727204255.17246: when evaluation is False, skipping this task 26264 1727204255.17340: _execute() done 26264 1727204255.17353: dumping result to json 26264 1727204255.17447: done dumping result, returning 26264 1727204255.17476: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-5ff5-08b0-000000000027] 26264 1727204255.17492: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000027 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204255.17656: no more pending results, returning what we have 26264 1727204255.17661: results queue empty 26264 1727204255.17662: checking for any_errors_fatal 26264 1727204255.17672: done checking for any_errors_fatal 26264 1727204255.17673: checking for max_fail_percentage 26264 1727204255.17675: done checking for max_fail_percentage 26264 
1727204255.17676: checking to see if all hosts have failed and the running result is not ok 26264 1727204255.17677: done checking to see if all hosts have failed 26264 1727204255.17678: getting the remaining hosts for this loop 26264 1727204255.17680: done getting the remaining hosts for this loop 26264 1727204255.17684: getting the next task for host managed-node3 26264 1727204255.17691: done getting next task for host managed-node3 26264 1727204255.17695: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 26264 1727204255.17697: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204255.17714: getting variables 26264 1727204255.17718: in VariableManager get_vars() 26264 1727204255.17762: Calling all_inventory to load vars for managed-node3 26264 1727204255.17767: Calling groups_inventory to load vars for managed-node3 26264 1727204255.17770: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204255.17783: Calling all_plugins_play to load vars for managed-node3 26264 1727204255.17786: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204255.17789: Calling groups_plugins_play to load vars for managed-node3 26264 1727204255.19066: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000027 26264 1727204255.19071: WORKER PROCESS EXITING 26264 1727204255.22684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204255.26388: done with get_vars() 26264 1727204255.26534: done getting variables 26264 1727204255.26598: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:35 -0400 (0:00:00.146) 0:00:19.116 ***** 26264 1727204255.26752: entering _queue_task() for managed-node3/copy 26264 1727204255.27421: worker is 1 (out of 1 available) 26264 1727204255.27433: exiting _queue_task() for managed-node3/copy 26264 1727204255.27445: done queuing things up, now waiting for results queue to drain 26264 1727204255.27446: waiting for pending results... 26264 1727204255.28301: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 26264 1727204255.28536: in run() - task 0affcd87-79f5-5ff5-08b0-000000000028 26264 1727204255.28587: variable 'ansible_search_path' from source: unknown 26264 1727204255.28594: variable 'ansible_search_path' from source: unknown 26264 1727204255.28631: calling self._execute() 26264 1727204255.28775: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204255.28907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204255.28920: variable 'omit' from source: magic vars 26264 1727204255.29634: variable 'ansible_distribution_major_version' from source: facts 26264 1727204255.29784: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204255.30024: variable 'network_provider' from source: set_fact 26264 1727204255.30035: Evaluated conditional (network_provider == "initscripts"): False 26264 1727204255.30043: when evaluation is False, skipping this task 26264 1727204255.30052: _execute() done 26264 1727204255.30059: dumping result to json 
26264 1727204255.30069: done dumping result, returning 26264 1727204255.30080: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-5ff5-08b0-000000000028] 26264 1727204255.30096: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000028 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 26264 1727204255.30244: no more pending results, returning what we have 26264 1727204255.30250: results queue empty 26264 1727204255.30251: checking for any_errors_fatal 26264 1727204255.30257: done checking for any_errors_fatal 26264 1727204255.30257: checking for max_fail_percentage 26264 1727204255.30259: done checking for max_fail_percentage 26264 1727204255.30260: checking to see if all hosts have failed and the running result is not ok 26264 1727204255.30261: done checking to see if all hosts have failed 26264 1727204255.30262: getting the remaining hosts for this loop 26264 1727204255.30265: done getting the remaining hosts for this loop 26264 1727204255.30269: getting the next task for host managed-node3 26264 1727204255.30275: done getting next task for host managed-node3 26264 1727204255.30279: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 26264 1727204255.30281: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204255.30296: getting variables 26264 1727204255.30298: in VariableManager get_vars() 26264 1727204255.30341: Calling all_inventory to load vars for managed-node3 26264 1727204255.30345: Calling groups_inventory to load vars for managed-node3 26264 1727204255.30350: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204255.30365: Calling all_plugins_play to load vars for managed-node3 26264 1727204255.30369: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204255.30372: Calling groups_plugins_play to load vars for managed-node3 26264 1727204255.31413: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000028 26264 1727204255.31416: WORKER PROCESS EXITING 26264 1727204255.33834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204255.37825: done with get_vars() 26264 1727204255.37865: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:35 -0400 (0:00:00.114) 0:00:19.231 ***** 26264 1727204255.38229: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 26264 1727204255.38231: Creating lock for fedora.linux_system_roles.network_connections 26264 1727204255.38989: worker is 1 (out of 1 available) 26264 1727204255.39002: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 26264 1727204255.39053: done queuing things up, now waiting for results queue to drain 26264 1727204255.39055: waiting for pending results... 
26264 1727204255.40017: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 26264 1727204255.40362: in run() - task 0affcd87-79f5-5ff5-08b0-000000000029 26264 1727204255.40387: variable 'ansible_search_path' from source: unknown 26264 1727204255.40395: variable 'ansible_search_path' from source: unknown 26264 1727204255.40441: calling self._execute() 26264 1727204255.40667: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204255.40692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204255.40804: variable 'omit' from source: magic vars 26264 1727204255.41702: variable 'ansible_distribution_major_version' from source: facts 26264 1727204255.41721: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204255.41794: variable 'omit' from source: magic vars 26264 1727204255.41862: variable 'omit' from source: magic vars 26264 1727204255.42257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204255.48489: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204255.48630: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204255.48919: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204255.48975: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204255.49068: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204255.49340: variable 'network_provider' from source: set_fact 26264 1727204255.49890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204255.49923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204255.49955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204255.50090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204255.50118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204255.50285: variable 'omit' from source: magic vars 26264 1727204255.50531: variable 'omit' from source: magic vars 26264 1727204255.50878: variable 'network_connections' from source: play vars 26264 1727204255.50896: variable 'interface' from source: set_fact 26264 1727204255.51096: variable 'interface' from source: set_fact 26264 1727204255.51109: variable 'interface' from source: set_fact 26264 1727204255.51182: variable 'interface' from source: set_fact 26264 1727204255.51677: variable 'omit' from source: magic vars 26264 1727204255.51769: variable '__lsr_ansible_managed' from source: task vars 26264 1727204255.51833: variable '__lsr_ansible_managed' from source: task vars 26264 1727204255.52741: Loaded config def from plugin (lookup/template) 26264 1727204255.52804: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 26264 1727204255.52904: File lookup term: get_ansible_managed.j2 26264 
1727204255.52914: variable 'ansible_search_path' from source: unknown 26264 1727204255.52925: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 26264 1727204255.52944: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 26264 1727204255.52982: variable 'ansible_search_path' from source: unknown 26264 1727204255.67228: variable 'ansible_managed' from source: unknown 26264 1727204255.67819: variable 'omit' from source: magic vars 26264 1727204255.68680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204255.68684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204255.68686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204255.68689: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204255.68691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204255.68693: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204255.68695: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204255.68697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204255.68699: Set connection var ansible_pipelining to False 26264 1727204255.68701: Set connection var ansible_connection to ssh 26264 1727204255.68703: Set connection var ansible_shell_type to sh 26264 1727204255.68705: Set connection var ansible_shell_executable to /bin/sh 26264 1727204255.68707: Set connection var ansible_timeout to 10 26264 1727204255.68709: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204255.68710: variable 'ansible_shell_executable' from source: unknown 26264 1727204255.68712: variable 'ansible_connection' from source: unknown 26264 1727204255.68714: variable 'ansible_module_compression' from source: unknown 26264 1727204255.68716: variable 'ansible_shell_type' from source: unknown 26264 1727204255.68718: variable 'ansible_shell_executable' from source: unknown 26264 1727204255.68720: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204255.68722: variable 'ansible_pipelining' from source: unknown 26264 1727204255.68724: variable 'ansible_timeout' from source: unknown 26264 1727204255.68727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204255.69118: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204255.69129: variable 'omit' from source: magic vars 26264 1727204255.69132: starting attempt loop 26264 1727204255.69134: running the handler 26264 1727204255.69170: _low_level_execute_command(): starting 26264 1727204255.69174: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204255.70453: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204255.70505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204255.70521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204255.70678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204255.70682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204255.70685: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204255.70687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204255.70690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204255.70692: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204255.70700: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204255.70703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204255.70706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204255.70708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204255.70717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 <<< 26264 1727204255.70723: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204255.70733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204255.70821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204255.70837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204255.70968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204255.70974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204255.72544: stdout chunk (state=3): >>>/root <<< 26264 1727204255.72742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204255.72746: stdout chunk (state=3): >>><<< 26264 1727204255.72748: stderr chunk (state=3): >>><<< 26264 1727204255.72873: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204255.72876: _low_level_execute_command(): starting 26264 1727204255.72880: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777 `" && echo ansible-tmp-1727204255.7280488-27748-108508125032777="` echo /root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777 `" ) && sleep 0' 26264 1727204255.74018: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204255.74035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204255.74052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204255.74076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204255.74122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204255.74136: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204255.74150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204255.74172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204255.74185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204255.74198: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204255.74215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204255.74229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204255.74245: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204255.74259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204255.74276: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204255.74290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204255.74370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204255.74423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204255.74442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204255.74577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204255.76386: stdout chunk (state=3): >>>ansible-tmp-1727204255.7280488-27748-108508125032777=/root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777 <<< 26264 1727204255.76569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204255.76573: stdout chunk (state=3): >>><<< 26264 1727204255.76583: stderr chunk (state=3): >>><<< 26264 1727204255.76885: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204255.7280488-27748-108508125032777=/root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204255.76894: variable 'ansible_module_compression' from source: unknown 26264 1727204255.76896: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 26264 1727204255.76899: ANSIBALLZ: Acquiring lock 26264 1727204255.76901: ANSIBALLZ: Lock acquired: 139841044506880 26264 1727204255.76903: ANSIBALLZ: Creating module 26264 1727204256.11515: ANSIBALLZ: Writing module into payload 26264 1727204256.12000: ANSIBALLZ: Writing module 26264 1727204256.12030: ANSIBALLZ: Renaming module 26264 1727204256.12033: ANSIBALLZ: Done creating module 26264 1727204256.12063: variable 'ansible_facts' from source: unknown 26264 1727204256.12166: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777/AnsiballZ_network_connections.py 26264 1727204256.12322: Sending initial data 26264 1727204256.12326: Sent initial data (168 bytes) 26264 1727204256.13437: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204256.13447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204256.13461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204256.13477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204256.13519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 
1727204256.13526: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204256.13537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204256.13550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204256.13562: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204256.13572: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204256.13580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204256.13590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204256.13602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204256.13608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204256.13616: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204256.13631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204256.13705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204256.13756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204256.13771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204256.13850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204256.15660: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204256.15700: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204256.15735: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp81jpim7k /root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777/AnsiballZ_network_connections.py <<< 26264 1727204256.15771: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204256.17408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204256.17576: stderr chunk (state=3): >>><<< 26264 1727204256.17579: stdout chunk (state=3): >>><<< 26264 1727204256.17581: done transferring module to remote 26264 1727204256.17583: _low_level_execute_command(): starting 26264 1727204256.17586: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777/ /root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777/AnsiballZ_network_connections.py && sleep 0' 26264 1727204256.18221: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204256.18224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204256.18261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204256.18267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204256.18269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204256.18272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204256.18329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204256.18347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204256.18429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204256.20175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204256.20178: stdout chunk (state=3): >>><<< 26264 1727204256.20180: stderr chunk (state=3): >>><<< 26264 1727204256.20263: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204256.20268: _low_level_execute_command(): starting 26264 1727204256.20271: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777/AnsiballZ_network_connections.py && sleep 0' 26264 1727204256.20804: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204256.20818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204256.20832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204256.20856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204256.20899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204256.20913: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204256.20927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204256.20945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204256.20959: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204256.20975: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204256.20988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204256.21002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204256.21018: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204256.21030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204256.21041: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204256.21054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204256.21130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204256.21147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204256.21163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204256.21284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204256.48281: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 26264 1727204256.50462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.15.87 closed. <<< 26264 1727204256.50471: stderr chunk (state=3): >>><<< 26264 1727204256.50474: stdout chunk (state=3): >>><<< 26264 1727204256.50641: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204256.50645: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204256.50647: _low_level_execute_command(): starting 26264 1727204256.50650: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204255.7280488-27748-108508125032777/ > /dev/null 2>&1 && sleep 0' 26264 1727204256.53127: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204256.53197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204256.53206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204256.53220: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204256.53263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204256.53278: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204256.53290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204256.53420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204256.53430: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204256.53436: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204256.53443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204256.53454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204256.53470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204256.53527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204256.53534: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204256.53546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204256.53620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204256.53870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204256.53966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204256.53973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204256.55872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204256.55876: stdout chunk (state=3): 
>>><<< 26264 1727204256.55879: stderr chunk (state=3): >>><<< 26264 1727204256.55881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204256.55884: handler run complete 26264 1727204256.55916: attempt loop complete, returning result 26264 1727204256.55919: _execute() done 26264 1727204256.55922: dumping result to json 26264 1727204256.55924: done dumping result, returning 26264 1727204256.55934: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-5ff5-08b0-000000000029] 26264 1727204256.55939: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000029 26264 1727204256.56059: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000029 26264 1727204256.56062: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { 
"module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d (not-active) 26264 1727204256.56284: no more pending results, returning what we have 26264 1727204256.56289: results queue empty 26264 1727204256.56290: checking for any_errors_fatal 26264 1727204256.56297: done checking for any_errors_fatal 26264 1727204256.56297: checking for max_fail_percentage 26264 1727204256.56299: done checking for max_fail_percentage 26264 1727204256.56300: checking to see if all hosts have failed and the running result is not ok 26264 1727204256.56301: done checking to see if all hosts have failed 26264 1727204256.56302: getting the remaining hosts for this loop 26264 1727204256.56304: done getting the remaining hosts for this loop 26264 1727204256.56309: getting the next task for host managed-node3 26264 1727204256.56315: done getting next task for host managed-node3 26264 1727204256.56319: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 26264 1727204256.56321: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204256.56333: getting variables 26264 1727204256.56335: in VariableManager get_vars() 26264 1727204256.56381: Calling all_inventory to load vars for managed-node3 26264 1727204256.56384: Calling groups_inventory to load vars for managed-node3 26264 1727204256.56387: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204256.56398: Calling all_plugins_play to load vars for managed-node3 26264 1727204256.56400: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204256.56403: Calling groups_plugins_play to load vars for managed-node3 26264 1727204256.59270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204256.63375: done with get_vars() 26264 1727204256.63407: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:36 -0400 (0:00:01.252) 0:00:20.484 ***** 26264 1727204256.63493: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 26264 1727204256.63495: Creating lock for fedora.linux_system_roles.network_state 26264 1727204256.64521: worker is 1 (out of 1 available) 26264 1727204256.64535: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 26264 1727204256.64551: done queuing things up, now waiting for results queue to drain 26264 1727204256.64553: waiting for pending results... 
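[editor's note] The task execution traced above follows Ansible's standard remote lifecycle, visible in the `_low_level_execute_command()` calls over the multiplexed SSH connection: create a private temp directory, sftp the `AnsiballZ_*.py` payload into it, `chmod` it, run it with the remote Python, then remove the directory. A minimal sketch of that sequence, using an illustrative path rather than the timestamped `/root/.ansible/tmp/ansible-tmp-…` directory from the real run:

```shell
# Sketch of the remote command sequence from the log (illustrative paths;
# the payload transfer and module execution steps are shown as comments
# because they depend on the controller-side AnsiballZ payload).
set -e
TMP="$HOME/.ansible/tmp/ansible-tmp-example"
# Step 1: private (umask 77 -> mode 0700) temp dir, as in the log's mkdir command
( umask 77 && mkdir -p "$HOME/.ansible/tmp" && mkdir "$TMP" )
# Step 2: payload would be sftp'd here as AnsiballZ_network_connections.py
# Step 3: chmod u+x "$TMP" "$TMP/AnsiballZ_network_connections.py"
# Step 4: /usr/bin/python3.9 "$TMP/AnsiballZ_network_connections.py"
# Step 5: cleanup, matching the log's 'rm -f -r ... > /dev/null 2>&1'
rm -rf "$TMP"
echo done
```

Setting `ANSIBLE_KEEP_REMOTE_FILES=1` on the controller skips step 5, which is the usual way to inspect the transferred payload when debugging a run like this one.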
26264 1727204256.65704: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 26264 1727204256.65870: in run() - task 0affcd87-79f5-5ff5-08b0-00000000002a 26264 1727204256.65886: variable 'ansible_search_path' from source: unknown 26264 1727204256.65890: variable 'ansible_search_path' from source: unknown 26264 1727204256.65928: calling self._execute() 26264 1727204256.66133: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204256.66137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204256.66147: variable 'omit' from source: magic vars 26264 1727204256.66984: variable 'ansible_distribution_major_version' from source: facts 26264 1727204256.66999: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204256.67229: variable 'network_state' from source: role '' defaults 26264 1727204256.67240: Evaluated conditional (network_state != {}): False 26264 1727204256.67244: when evaluation is False, skipping this task 26264 1727204256.67247: _execute() done 26264 1727204256.67249: dumping result to json 26264 1727204256.67255: done dumping result, returning 26264 1727204256.67263: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-5ff5-08b0-00000000002a] 26264 1727204256.67374: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000002a 26264 1727204256.67469: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000002a 26264 1727204256.67472: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204256.67534: no more pending results, returning what we have 26264 1727204256.67539: results queue empty 26264 1727204256.67539: checking for any_errors_fatal 26264 1727204256.67552: done checking for any_errors_fatal 
26264 1727204256.67552: checking for max_fail_percentage 26264 1727204256.67554: done checking for max_fail_percentage 26264 1727204256.67555: checking to see if all hosts have failed and the running result is not ok 26264 1727204256.67556: done checking to see if all hosts have failed 26264 1727204256.67557: getting the remaining hosts for this loop 26264 1727204256.67559: done getting the remaining hosts for this loop 26264 1727204256.67565: getting the next task for host managed-node3 26264 1727204256.67573: done getting next task for host managed-node3 26264 1727204256.67578: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 26264 1727204256.67580: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204256.67597: getting variables 26264 1727204256.67598: in VariableManager get_vars() 26264 1727204256.67639: Calling all_inventory to load vars for managed-node3 26264 1727204256.67642: Calling groups_inventory to load vars for managed-node3 26264 1727204256.67645: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204256.67661: Calling all_plugins_play to load vars for managed-node3 26264 1727204256.67666: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204256.67670: Calling groups_plugins_play to load vars for managed-node3 26264 1727204256.70417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204256.74963: done with get_vars() 26264 1727204256.74993: done getting variables 26264 1727204256.75058: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:36 -0400 (0:00:00.121) 0:00:20.605 ***** 26264 1727204256.75618: entering _queue_task() for managed-node3/debug 26264 1727204256.75940: worker is 1 (out of 1 available) 26264 1727204256.75954: exiting _queue_task() for managed-node3/debug 26264 1727204256.75969: done queuing things up, now waiting for results queue to drain 26264 1727204256.75970: waiting for pending results... 
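[editor's note] The `network_connections` result recorded earlier in the log is a single JSON object on the module's stdout (`changed`, `warnings`, `stderr`, `invocation`). A minimal sketch of pulling fields out of it, with the JSON literal abridged from the log and `python3` assumed available on the controller:

```shell
# Parse the (abridged) module result JSON captured in the log above.
RESULT='{"changed": true, "warnings": [], "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}]}}}'
python3 - "$RESULT" <<'PY'
import json, sys

result = json.loads(sys.argv[1])
conn = result["invocation"]["module_args"]["connections"][0]
# Report whether the run changed anything, and which profile/address it managed
print(result["changed"], conn["name"], conn["ip"]["address"])
PY
```

This is the same structure Ansible itself deserializes before printing the `changed: [managed-node3] => { ... }` block seen above.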
26264 1727204256.76837: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 26264 1727204256.77079: in run() - task 0affcd87-79f5-5ff5-08b0-00000000002b 26264 1727204256.77189: variable 'ansible_search_path' from source: unknown 26264 1727204256.77197: variable 'ansible_search_path' from source: unknown 26264 1727204256.77243: calling self._execute() 26264 1727204256.77426: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204256.77479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204256.77562: variable 'omit' from source: magic vars 26264 1727204256.78990: variable 'ansible_distribution_major_version' from source: facts 26264 1727204256.79155: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204256.79255: variable 'omit' from source: magic vars 26264 1727204256.79300: variable 'omit' from source: magic vars 26264 1727204256.79394: variable 'omit' from source: magic vars 26264 1727204256.79611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204256.79716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204256.79741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204256.79809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204256.79983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204256.80021: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204256.80031: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204256.80043: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 26264 1727204256.80485: Set connection var ansible_pipelining to False 26264 1727204256.80493: Set connection var ansible_connection to ssh 26264 1727204256.80499: Set connection var ansible_shell_type to sh 26264 1727204256.80509: Set connection var ansible_shell_executable to /bin/sh 26264 1727204256.80520: Set connection var ansible_timeout to 10 26264 1727204256.80530: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204256.80569: variable 'ansible_shell_executable' from source: unknown 26264 1727204256.80772: variable 'ansible_connection' from source: unknown 26264 1727204256.80780: variable 'ansible_module_compression' from source: unknown 26264 1727204256.80787: variable 'ansible_shell_type' from source: unknown 26264 1727204256.80794: variable 'ansible_shell_executable' from source: unknown 26264 1727204256.80802: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204256.80809: variable 'ansible_pipelining' from source: unknown 26264 1727204256.80814: variable 'ansible_timeout' from source: unknown 26264 1727204256.80821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204256.80973: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204256.81086: variable 'omit' from source: magic vars 26264 1727204256.81209: starting attempt loop 26264 1727204256.81214: running the handler 26264 1727204256.81362: variable '__network_connections_result' from source: set_fact 26264 1727204256.81585: handler run complete 26264 1727204256.81608: attempt loop complete, returning result 26264 1727204256.81616: _execute() done 26264 1727204256.81623: dumping result to json 26264 1727204256.81636: 
done dumping result, returning 26264 1727204256.81651: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-5ff5-08b0-00000000002b] 26264 1727204256.81662: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000002b ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d (not-active)" ] } 26264 1727204256.81906: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000002b 26264 1727204256.81911: WORKER PROCESS EXITING 26264 1727204256.81918: no more pending results, returning what we have 26264 1727204256.81922: results queue empty 26264 1727204256.81922: checking for any_errors_fatal 26264 1727204256.81928: done checking for any_errors_fatal 26264 1727204256.81929: checking for max_fail_percentage 26264 1727204256.81930: done checking for max_fail_percentage 26264 1727204256.81931: checking to see if all hosts have failed and the running result is not ok 26264 1727204256.81933: done checking to see if all hosts have failed 26264 1727204256.81933: getting the remaining hosts for this loop 26264 1727204256.81936: done getting the remaining hosts for this loop 26264 1727204256.81940: getting the next task for host managed-node3 26264 1727204256.81946: done getting next task for host managed-node3 26264 1727204256.81950: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 26264 1727204256.81952: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 26264 1727204256.81962: getting variables 26264 1727204256.81965: in VariableManager get_vars() 26264 1727204256.82002: Calling all_inventory to load vars for managed-node3 26264 1727204256.82005: Calling groups_inventory to load vars for managed-node3 26264 1727204256.82007: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204256.82017: Calling all_plugins_play to load vars for managed-node3 26264 1727204256.82020: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204256.82022: Calling groups_plugins_play to load vars for managed-node3 26264 1727204256.84793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204256.88303: done with get_vars() 26264 1727204256.88336: done getting variables 26264 1727204256.88522: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:36 -0400 (0:00:00.129) 0:00:20.734 ***** 26264 1727204256.88554: entering _queue_task() for managed-node3/debug 26264 1727204256.89224: worker is 1 (out of 1 available) 26264 1727204256.89238: exiting _queue_task() for managed-node3/debug 26264 1727204256.89252: done queuing things up, now waiting for results queue to drain 26264 1727204256.89253: waiting for pending results... 
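The companion task at main.yml:181 differs only in dumping the entire registered result rather than just its stderr lines. A hedged sketch of its likely shape:

```yaml
# Hypothetical sketch -- the real task at main.yml:181 may differ.
- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
```

Printing the whole variable exposes `_invocation.module_args` as well, which is how the full connection profile (interface `lsr27`, address `192.0.2.1/24`, provider `nm`) shows up in the task output that follows.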
26264 1727204256.90195: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 26264 1727204256.90405: in run() - task 0affcd87-79f5-5ff5-08b0-00000000002c 26264 1727204256.90417: variable 'ansible_search_path' from source: unknown 26264 1727204256.90421: variable 'ansible_search_path' from source: unknown 26264 1727204256.90456: calling self._execute() 26264 1727204256.90656: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204256.90660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204256.90672: variable 'omit' from source: magic vars 26264 1727204256.91523: variable 'ansible_distribution_major_version' from source: facts 26264 1727204256.91542: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204256.91570: variable 'omit' from source: magic vars 26264 1727204256.91612: variable 'omit' from source: magic vars 26264 1727204256.91761: variable 'omit' from source: magic vars 26264 1727204256.91882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204256.92014: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204256.92043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204256.92073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204256.92097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204256.92173: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204256.92181: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204256.92276: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 26264 1727204256.92408: Set connection var ansible_pipelining to False 26264 1727204256.92417: Set connection var ansible_connection to ssh 26264 1727204256.92424: Set connection var ansible_shell_type to sh 26264 1727204256.92436: Set connection var ansible_shell_executable to /bin/sh 26264 1727204256.92450: Set connection var ansible_timeout to 10 26264 1727204256.92466: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204256.92494: variable 'ansible_shell_executable' from source: unknown 26264 1727204256.92503: variable 'ansible_connection' from source: unknown 26264 1727204256.92513: variable 'ansible_module_compression' from source: unknown 26264 1727204256.92519: variable 'ansible_shell_type' from source: unknown 26264 1727204256.92525: variable 'ansible_shell_executable' from source: unknown 26264 1727204256.92531: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204256.92537: variable 'ansible_pipelining' from source: unknown 26264 1727204256.92545: variable 'ansible_timeout' from source: unknown 26264 1727204256.92555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204256.92706: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204256.92725: variable 'omit' from source: magic vars 26264 1727204256.92738: starting attempt loop 26264 1727204256.92745: running the handler 26264 1727204256.92799: variable '__network_connections_result' from source: set_fact 26264 1727204256.92890: variable '__network_connections_result' from source: set_fact 26264 1727204256.93027: handler run complete 26264 1727204256.93068: attempt loop complete, returning result 26264 1727204256.93077: 
_execute() done 26264 1727204256.93084: dumping result to json 26264 1727204256.93093: done dumping result, returning 26264 1727204256.93105: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-5ff5-08b0-00000000002c] 26264 1727204256.93113: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000002c 26264 1727204256.93214: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000002c ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 28a85e21-2916-4a1f-bfbf-d47ba3d76f0d (not-active)" ] } } 26264 1727204256.93302: no more pending results, returning what we have 26264 1727204256.93307: results queue empty 26264 1727204256.93307: checking for any_errors_fatal 26264 1727204256.93313: done checking for any_errors_fatal 26264 1727204256.93314: checking for max_fail_percentage 26264 1727204256.93316: done checking for max_fail_percentage 26264 1727204256.93317: checking to see if all hosts have failed and the running result is not ok 26264 1727204256.93318: done checking to see if all hosts have failed 26264 
1727204256.93318: getting the remaining hosts for this loop 26264 1727204256.93320: done getting the remaining hosts for this loop 26264 1727204256.93324: getting the next task for host managed-node3 26264 1727204256.93330: done getting next task for host managed-node3 26264 1727204256.93334: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 26264 1727204256.93335: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204256.93351: getting variables 26264 1727204256.93353: in VariableManager get_vars() 26264 1727204256.93390: Calling all_inventory to load vars for managed-node3 26264 1727204256.93392: Calling groups_inventory to load vars for managed-node3 26264 1727204256.93395: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204256.93405: Calling all_plugins_play to load vars for managed-node3 26264 1727204256.93408: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204256.93411: Calling groups_plugins_play to load vars for managed-node3 26264 1727204256.94178: WORKER PROCESS EXITING 26264 1727204256.95453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204256.97333: done with get_vars() 26264 1727204256.97378: done getting variables 26264 1727204256.97443: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:36 -0400 (0:00:00.089) 0:00:20.824 ***** 26264 1727204256.97487: entering _queue_task() for managed-node3/debug 26264 1727204256.97922: worker is 1 (out of 1 available) 26264 1727204256.97935: exiting _queue_task() for managed-node3/debug 26264 1727204256.97952: done queuing things up, now waiting for results queue to drain 26264 1727204256.97953: waiting for pending results... 26264 1727204256.98379: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 26264 1727204256.98603: in run() - task 0affcd87-79f5-5ff5-08b0-00000000002d 26264 1727204256.98700: variable 'ansible_search_path' from source: unknown 26264 1727204256.98708: variable 'ansible_search_path' from source: unknown 26264 1727204256.98754: calling self._execute() 26264 1727204256.98870: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204256.98881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204256.98896: variable 'omit' from source: magic vars 26264 1727204256.99502: variable 'ansible_distribution_major_version' from source: facts 26264 1727204256.99596: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204256.99733: variable 'network_state' from source: role '' defaults 26264 1727204256.99746: Evaluated conditional (network_state != {}): False 26264 1727204256.99755: when evaluation is False, skipping this task 26264 1727204256.99761: _execute() done 26264 1727204256.99768: dumping result to json 26264 1727204256.99774: done dumping result, returning 26264 1727204256.99783: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-5ff5-08b0-00000000002d] 26264 1727204256.99799: sending task result for task 
0affcd87-79f5-5ff5-08b0-00000000002d 26264 1727204256.99917: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000002d skipping: [managed-node3] => { "false_condition": "network_state != {}" } 26264 1727204256.99976: no more pending results, returning what we have 26264 1727204256.99981: results queue empty 26264 1727204256.99982: checking for any_errors_fatal 26264 1727204256.99990: done checking for any_errors_fatal 26264 1727204256.99991: checking for max_fail_percentage 26264 1727204256.99993: done checking for max_fail_percentage 26264 1727204256.99995: checking to see if all hosts have failed and the running result is not ok 26264 1727204256.99996: done checking to see if all hosts have failed 26264 1727204256.99997: getting the remaining hosts for this loop 26264 1727204256.99998: done getting the remaining hosts for this loop 26264 1727204257.00003: getting the next task for host managed-node3 26264 1727204257.00010: done getting next task for host managed-node3 26264 1727204257.00016: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 26264 1727204257.00019: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204257.00035: getting variables 26264 1727204257.00037: in VariableManager get_vars() 26264 1727204257.00088: Calling all_inventory to load vars for managed-node3 26264 1727204257.00092: Calling groups_inventory to load vars for managed-node3 26264 1727204257.00094: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204257.00108: Calling all_plugins_play to load vars for managed-node3 26264 1727204257.00111: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204257.00114: Calling groups_plugins_play to load vars for managed-node3 26264 1727204257.01132: WORKER PROCESS EXITING 26264 1727204257.03023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204257.05426: done with get_vars() 26264 1727204257.05469: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.080) 0:00:20.904 ***** 26264 1727204257.05586: entering _queue_task() for managed-node3/ping 26264 1727204257.05588: Creating lock for ping 26264 1727204257.06087: worker is 1 (out of 1 available) 26264 1727204257.06101: exiting _queue_task() for managed-node3/ping 26264 1727204257.06114: done queuing things up, now waiting for results queue to drain 26264 1727204257.06116: waiting for pending results... 
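Unlike the preceding `debug` tasks, the "Re-test connectivity" task uses the `ping` module and therefore needs remote execution; the trace below shows Ansible's standard delivery pipeline for it: probe the remote home directory (`echo ~`), create a temp directory under `~/.ansible/tmp`, build the module into a self-contained AnsiballZ payload, transfer it over SFTP, `chmod u+x` it, then execute it. The task itself is trivial; a sketch (hedged, the real task at main.yml:192 may differ):

```yaml
# Hypothetical sketch -- the real task at main.yml:192 may differ.
- name: Re-test connectivity
  ping:
```

`ping` here is Ansible's `ansible.builtin.ping` module, a trivial Python round-trip that returns "pong" on success, not an ICMP ping. The mkdir/sftp/chmod churn visible in the trace is the generic module-delivery mechanism; with `ansible_pipelining` enabled (it is set to False above) the file transfer would be skipped and the module piped straight to the remote Python interpreter.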
26264 1727204257.06418: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 26264 1727204257.06538: in run() - task 0affcd87-79f5-5ff5-08b0-00000000002e 26264 1727204257.06573: variable 'ansible_search_path' from source: unknown 26264 1727204257.06614: variable 'ansible_search_path' from source: unknown 26264 1727204257.06662: calling self._execute() 26264 1727204257.06929: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204257.06945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204257.06962: variable 'omit' from source: magic vars 26264 1727204257.07842: variable 'ansible_distribution_major_version' from source: facts 26264 1727204257.07868: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204257.07886: variable 'omit' from source: magic vars 26264 1727204257.07937: variable 'omit' from source: magic vars 26264 1727204257.07986: variable 'omit' from source: magic vars 26264 1727204257.08089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204257.08213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204257.08298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204257.08322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204257.08371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204257.08488: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204257.08497: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204257.08505: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 26264 1727204257.08979: Set connection var ansible_pipelining to False 26264 1727204257.08984: Set connection var ansible_connection to ssh 26264 1727204257.08986: Set connection var ansible_shell_type to sh 26264 1727204257.08993: Set connection var ansible_shell_executable to /bin/sh 26264 1727204257.09000: Set connection var ansible_timeout to 10 26264 1727204257.09007: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204257.09032: variable 'ansible_shell_executable' from source: unknown 26264 1727204257.09035: variable 'ansible_connection' from source: unknown 26264 1727204257.09038: variable 'ansible_module_compression' from source: unknown 26264 1727204257.09040: variable 'ansible_shell_type' from source: unknown 26264 1727204257.09042: variable 'ansible_shell_executable' from source: unknown 26264 1727204257.09044: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204257.09051: variable 'ansible_pipelining' from source: unknown 26264 1727204257.09054: variable 'ansible_timeout' from source: unknown 26264 1727204257.09056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204257.09244: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204257.09255: variable 'omit' from source: magic vars 26264 1727204257.09261: starting attempt loop 26264 1727204257.09265: running the handler 26264 1727204257.09278: _low_level_execute_command(): starting 26264 1727204257.09286: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204257.09955: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204257.09968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 
1727204257.09982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204257.09992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.10032: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204257.10040: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204257.10053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204257.10063: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204257.10074: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204257.10080: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204257.10089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204257.10097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204257.10108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.10116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204257.10122: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204257.10131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204257.10198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204257.10219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204257.10258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204257.10365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 
1727204257.11987: stdout chunk (state=3): >>>/root <<< 26264 1727204257.12164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204257.12177: stdout chunk (state=3): >>><<< 26264 1727204257.12180: stderr chunk (state=3): >>><<< 26264 1727204257.12199: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204257.12213: _low_level_execute_command(): starting 26264 1727204257.12219: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995 `" && echo ansible-tmp-1727204257.1219985-28123-68712304081995="` echo /root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995 `" ) && sleep 0' 26264 1727204257.13043: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.13384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204257.15057: stdout chunk (state=3): >>>ansible-tmp-1727204257.1219985-28123-68712304081995=/root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995 <<< 26264 1727204257.15263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204257.15269: stdout chunk (state=3): >>><<< 26264 1727204257.15272: stderr chunk (state=3): >>><<< 26264 1727204257.15475: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204257.1219985-28123-68712304081995=/root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204257.15479: variable 'ansible_module_compression' from source: unknown 26264 1727204257.15481: ANSIBALLZ: Using lock for ping 26264 1727204257.15483: ANSIBALLZ: Acquiring lock 26264 1727204257.15485: ANSIBALLZ: Lock acquired: 139841024706272 26264 1727204257.15487: ANSIBALLZ: Creating module 26264 1727204257.34239: ANSIBALLZ: Writing module into payload 26264 1727204257.34308: ANSIBALLZ: Writing module 26264 1727204257.34335: ANSIBALLZ: Renaming module 26264 1727204257.34341: ANSIBALLZ: Done creating module 26264 1727204257.34362: variable 'ansible_facts' from source: unknown 26264 1727204257.34435: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995/AnsiballZ_ping.py 26264 1727204257.34583: Sending initial data 26264 1727204257.34586: Sent initial data (152 bytes) 26264 1727204257.35583: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204257.35593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204257.35604: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 26264 1727204257.35646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.35690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204257.35697: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204257.35708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204257.35723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204257.35736: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204257.35742: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204257.35749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204257.35767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204257.35775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.35783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204257.35789: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204257.35799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204257.35881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204257.35900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204257.35912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204257.35984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204257.37769: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204257.37807: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204257.37862: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmptdhq76sa /root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995/AnsiballZ_ping.py <<< 26264 1727204257.37896: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204257.38975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204257.39061: stderr chunk (state=3): >>><<< 26264 1727204257.39067: stdout chunk (state=3): >>><<< 26264 1727204257.39091: done transferring module to remote 26264 1727204257.39103: _low_level_execute_command(): starting 26264 1727204257.39108: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995/ /root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995/AnsiballZ_ping.py && sleep 0' 26264 1727204257.39786: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204257.39795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204257.39805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204257.39817: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.39867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204257.39875: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204257.39884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204257.39897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204257.39904: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204257.39910: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204257.39918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204257.39927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204257.39938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.39956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204257.39962: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204257.39973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204257.40037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204257.40057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204257.40077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204257.40144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204257.41911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204257.41915: stdout chunk (state=3): >>><<< 26264 
1727204257.41922: stderr chunk (state=3): >>><<< 26264 1727204257.41939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204257.41942: _low_level_execute_command(): starting 26264 1727204257.41946: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995/AnsiballZ_ping.py && sleep 0' 26264 1727204257.42632: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204257.42638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204257.42649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204257.42669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.42710: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204257.42723: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204257.42741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204257.42757: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204257.42766: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204257.42778: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204257.42786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204257.42795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204257.42806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.42814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204257.42820: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204257.42830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204257.42909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204257.42916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204257.42919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204257.42998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204257.55839: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 26264 1727204257.56921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204257.56925: stdout chunk (state=3): >>><<< 26264 1727204257.56928: stderr chunk (state=3): >>><<< 26264 1727204257.57070: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204257.57074: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204257.57077: _low_level_execute_command(): starting 26264 1727204257.57083: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204257.1219985-28123-68712304081995/ > /dev/null 2>&1 && sleep 0' 26264 1727204257.57709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204257.57726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204257.57743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204257.57761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.57805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204257.57816: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204257.57835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204257.57854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204257.57867: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 
1727204257.57877: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204257.57887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204257.57898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204257.57911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204257.57921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204257.57935: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204257.57947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204257.58028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204257.58056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204257.58078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204257.58175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204257.59960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204257.60057: stderr chunk (state=3): >>><<< 26264 1727204257.60072: stdout chunk (state=3): >>><<< 26264 1727204257.60474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204257.60477: handler run complete 26264 1727204257.60480: attempt loop complete, returning result 26264 1727204257.60482: _execute() done 26264 1727204257.60484: dumping result to json 26264 1727204257.60486: done dumping result, returning 26264 1727204257.60489: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-5ff5-08b0-00000000002e] 26264 1727204257.60491: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000002e 26264 1727204257.60561: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000002e ok: [managed-node3] => { "changed": false, "ping": "pong" } 26264 1727204257.60615: WORKER PROCESS EXITING 26264 1727204257.60623: no more pending results, returning what we have 26264 1727204257.60627: results queue empty 26264 1727204257.60627: checking for any_errors_fatal 26264 1727204257.60632: done checking for any_errors_fatal 26264 1727204257.60633: checking for max_fail_percentage 26264 1727204257.60635: done checking for max_fail_percentage 26264 1727204257.60636: checking to see if all hosts have failed and the running result is not ok 26264 1727204257.60637: done checking to see if all hosts have failed 26264 1727204257.60638: getting the remaining hosts for this loop 26264 1727204257.60640: done getting the remaining hosts for this loop 26264 1727204257.60644: getting 
the next task for host managed-node3 26264 1727204257.60654: done getting next task for host managed-node3 26264 1727204257.60656: ^ task is: TASK: meta (role_complete) 26264 1727204257.60658: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204257.60671: getting variables 26264 1727204257.60672: in VariableManager get_vars() 26264 1727204257.60710: Calling all_inventory to load vars for managed-node3 26264 1727204257.60713: Calling groups_inventory to load vars for managed-node3 26264 1727204257.60716: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204257.60727: Calling all_plugins_play to load vars for managed-node3 26264 1727204257.60730: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204257.60734: Calling groups_plugins_play to load vars for managed-node3 26264 1727204257.62419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204257.64183: done with get_vars() 26264 1727204257.64211: done getting variables 26264 1727204257.64308: done queuing things up, now waiting for results queue to drain 26264 1727204257.64310: results queue empty 26264 1727204257.64311: checking for any_errors_fatal 26264 1727204257.64314: done checking for any_errors_fatal 26264 1727204257.64315: checking for max_fail_percentage 26264 1727204257.64317: done checking for max_fail_percentage 26264 1727204257.64317: checking to see if all hosts have failed and the running result is not ok 26264 1727204257.64318: done checking to see if all hosts have failed 26264 1727204257.64319: getting the remaining hosts for this loop 26264 1727204257.64320: done getting the remaining hosts for this loop 26264 1727204257.64323: 
getting the next task for host managed-node3 26264 1727204257.64327: done getting next task for host managed-node3 26264 1727204257.64330: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 26264 1727204257.64331: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204257.64338: getting variables 26264 1727204257.64339: in VariableManager get_vars() 26264 1727204257.64356: Calling all_inventory to load vars for managed-node3 26264 1727204257.64358: Calling groups_inventory to load vars for managed-node3 26264 1727204257.64361: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204257.64374: Calling all_plugins_play to load vars for managed-node3 26264 1727204257.64376: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204257.64379: Calling groups_plugins_play to load vars for managed-node3 26264 1727204257.65623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204257.67424: done with get_vars() 26264 1727204257.67465: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.619) 0:00:21.524 ***** 26264 1727204257.67561: entering _queue_task() for managed-node3/include_tasks 26264 1727204257.67929: worker is 1 (out of 1 available) 26264 1727204257.67941: exiting _queue_task() for managed-node3/include_tasks 26264 1727204257.67957: done queuing things up, now waiting for results queue to drain 26264 1727204257.67959: waiting for pending results... 
26264 1727204257.68250: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 26264 1727204257.68346: in run() - task 0affcd87-79f5-5ff5-08b0-000000000030 26264 1727204257.68360: variable 'ansible_search_path' from source: unknown 26264 1727204257.68402: calling self._execute() 26264 1727204257.68494: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204257.68497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204257.68511: variable 'omit' from source: magic vars 26264 1727204257.68903: variable 'ansible_distribution_major_version' from source: facts 26264 1727204257.68920: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204257.68926: _execute() done 26264 1727204257.68929: dumping result to json 26264 1727204257.68935: done dumping result, returning 26264 1727204257.68943: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [0affcd87-79f5-5ff5-08b0-000000000030] 26264 1727204257.68949: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000030 26264 1727204257.69073: no more pending results, returning what we have 26264 1727204257.69078: in VariableManager get_vars() 26264 1727204257.69122: Calling all_inventory to load vars for managed-node3 26264 1727204257.69126: Calling groups_inventory to load vars for managed-node3 26264 1727204257.69128: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204257.69145: Calling all_plugins_play to load vars for managed-node3 26264 1727204257.69151: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204257.69156: Calling groups_plugins_play to load vars for managed-node3 26264 1727204257.70037: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000030 26264 1727204257.70041: WORKER PROCESS EXITING 26264 1727204257.71061: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204257.72956: done with get_vars() 26264 1727204257.72989: variable 'ansible_search_path' from source: unknown 26264 1727204257.73007: we have included files to process 26264 1727204257.73008: generating all_blocks data 26264 1727204257.73010: done generating all_blocks data 26264 1727204257.73015: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 26264 1727204257.73016: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 26264 1727204257.73019: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 26264 1727204257.73378: done processing included file 26264 1727204257.73381: iterating over new_blocks loaded from include file 26264 1727204257.73382: in VariableManager get_vars() 26264 1727204257.73399: done with get_vars() 26264 1727204257.73401: filtering new block on tags 26264 1727204257.73425: done filtering new block on tags 26264 1727204257.73428: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed-node3 26264 1727204257.73434: extending task lists for all hosts with included blocks 26264 1727204257.73470: done extending task lists 26264 1727204257.73471: done processing included files 26264 1727204257.73473: results queue empty 26264 1727204257.73473: checking for any_errors_fatal 26264 1727204257.73475: done checking for any_errors_fatal 26264 1727204257.73476: checking for max_fail_percentage 26264 1727204257.73477: done checking for 
max_fail_percentage 26264 1727204257.73478: checking to see if all hosts have failed and the running result is not ok 26264 1727204257.73479: done checking to see if all hosts have failed 26264 1727204257.73480: getting the remaining hosts for this loop 26264 1727204257.73481: done getting the remaining hosts for this loop 26264 1727204257.73483: getting the next task for host managed-node3 26264 1727204257.73488: done getting next task for host managed-node3 26264 1727204257.73490: ^ task is: TASK: Assert that warnings is empty 26264 1727204257.73492: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204257.73495: getting variables 26264 1727204257.73496: in VariableManager get_vars() 26264 1727204257.73508: Calling all_inventory to load vars for managed-node3 26264 1727204257.73510: Calling groups_inventory to load vars for managed-node3 26264 1727204257.73512: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204257.73517: Calling all_plugins_play to load vars for managed-node3 26264 1727204257.73520: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204257.73523: Calling groups_plugins_play to load vars for managed-node3 26264 1727204257.75797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204257.77721: done with get_vars() 26264 1727204257.77760: done getting variables 26264 1727204257.77809: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.102) 0:00:21.627 ***** 26264 1727204257.77841: entering _queue_task() for managed-node3/assert 26264 1727204257.78202: worker is 1 (out of 1 available) 26264 1727204257.78214: exiting _queue_task() for managed-node3/assert 26264 1727204257.78226: done queuing things up, now waiting for results queue to drain 26264 1727204257.78228: waiting for pending results... 
26264 1727204257.78685: running TaskExecutor() for managed-node3/TASK: Assert that warnings is empty 26264 1727204257.78708: in run() - task 0affcd87-79f5-5ff5-08b0-000000000304 26264 1727204257.78722: variable 'ansible_search_path' from source: unknown 26264 1727204257.78726: variable 'ansible_search_path' from source: unknown 26264 1727204257.78770: calling self._execute() 26264 1727204257.78866: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204257.78870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204257.78881: variable 'omit' from source: magic vars 26264 1727204257.79281: variable 'ansible_distribution_major_version' from source: facts 26264 1727204257.79292: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204257.79298: variable 'omit' from source: magic vars 26264 1727204257.79339: variable 'omit' from source: magic vars 26264 1727204257.79392: variable 'omit' from source: magic vars 26264 1727204257.79430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204257.79474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204257.79499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204257.79517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204257.79530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204257.79570: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204257.79575: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204257.79577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 
1727204257.79687: Set connection var ansible_pipelining to False 26264 1727204257.79691: Set connection var ansible_connection to ssh 26264 1727204257.79693: Set connection var ansible_shell_type to sh 26264 1727204257.79698: Set connection var ansible_shell_executable to /bin/sh 26264 1727204257.79713: Set connection var ansible_timeout to 10 26264 1727204257.79720: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204257.79743: variable 'ansible_shell_executable' from source: unknown 26264 1727204257.79747: variable 'ansible_connection' from source: unknown 26264 1727204257.79749: variable 'ansible_module_compression' from source: unknown 26264 1727204257.79753: variable 'ansible_shell_type' from source: unknown 26264 1727204257.79756: variable 'ansible_shell_executable' from source: unknown 26264 1727204257.79760: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204257.79762: variable 'ansible_pipelining' from source: unknown 26264 1727204257.79764: variable 'ansible_timeout' from source: unknown 26264 1727204257.79771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204257.79920: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204257.80070: variable 'omit' from source: magic vars 26264 1727204257.80074: starting attempt loop 26264 1727204257.80077: running the handler 26264 1727204257.80192: variable '__network_connections_result' from source: set_fact 26264 1727204257.80195: Evaluated conditional ('warnings' not in __network_connections_result): True 26264 1727204257.80198: handler run complete 26264 1727204257.80200: attempt loop complete, returning result 26264 1727204257.80202: _execute() done 26264 
1727204257.80205: dumping result to json 26264 1727204257.80207: done dumping result, returning 26264 1727204257.80209: done running TaskExecutor() for managed-node3/TASK: Assert that warnings is empty [0affcd87-79f5-5ff5-08b0-000000000304] 26264 1727204257.80211: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000304 26264 1727204257.80285: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000304 26264 1727204257.80289: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 26264 1727204257.80518: no more pending results, returning what we have 26264 1727204257.80523: results queue empty 26264 1727204257.80524: checking for any_errors_fatal 26264 1727204257.80526: done checking for any_errors_fatal 26264 1727204257.80526: checking for max_fail_percentage 26264 1727204257.80528: done checking for max_fail_percentage 26264 1727204257.80530: checking to see if all hosts have failed and the running result is not ok 26264 1727204257.80531: done checking to see if all hosts have failed 26264 1727204257.80532: getting the remaining hosts for this loop 26264 1727204257.80534: done getting the remaining hosts for this loop 26264 1727204257.80538: getting the next task for host managed-node3 26264 1727204257.80546: done getting next task for host managed-node3 26264 1727204257.80552: ^ task is: TASK: Assert that there is output in stderr 26264 1727204257.80555: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 26264 1727204257.80559: getting variables 26264 1727204257.80561: in VariableManager get_vars() 26264 1727204257.80606: Calling all_inventory to load vars for managed-node3 26264 1727204257.80610: Calling groups_inventory to load vars for managed-node3 26264 1727204257.80612: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204257.80624: Calling all_plugins_play to load vars for managed-node3 26264 1727204257.80627: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204257.80629: Calling groups_plugins_play to load vars for managed-node3 26264 1727204257.83623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204257.86740: done with get_vars() 26264 1727204257.86778: done getting variables 26264 1727204257.86840: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.097) 0:00:21.724 ***** 26264 1727204257.87582: entering _queue_task() for managed-node3/assert 26264 1727204257.87955: worker is 1 (out of 1 available) 26264 1727204257.87970: exiting _queue_task() for managed-node3/assert 26264 1727204257.87983: done queuing things up, now waiting for results queue to drain 26264 1727204257.87985: waiting for pending results... 
26264 1727204257.88737: running TaskExecutor() for managed-node3/TASK: Assert that there is output in stderr 26264 1727204257.89082: in run() - task 0affcd87-79f5-5ff5-08b0-000000000305 26264 1727204257.89101: variable 'ansible_search_path' from source: unknown 26264 1727204257.89109: variable 'ansible_search_path' from source: unknown 26264 1727204257.89810: calling self._execute() 26264 1727204257.89915: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204257.89927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204257.89940: variable 'omit' from source: magic vars 26264 1727204257.90314: variable 'ansible_distribution_major_version' from source: facts 26264 1727204257.90332: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204257.90343: variable 'omit' from source: magic vars 26264 1727204257.90396: variable 'omit' from source: magic vars 26264 1727204257.90446: variable 'omit' from source: magic vars 26264 1727204257.90494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204257.91208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204257.91238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204257.91262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204257.91282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204257.91314: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204257.91322: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204257.91329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 
1727204257.91431: Set connection var ansible_pipelining to False 26264 1727204257.91439: Set connection var ansible_connection to ssh 26264 1727204257.91445: Set connection var ansible_shell_type to sh 26264 1727204257.91455: Set connection var ansible_shell_executable to /bin/sh 26264 1727204257.91470: Set connection var ansible_timeout to 10 26264 1727204257.91481: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204257.91511: variable 'ansible_shell_executable' from source: unknown 26264 1727204257.91518: variable 'ansible_connection' from source: unknown 26264 1727204257.91525: variable 'ansible_module_compression' from source: unknown 26264 1727204257.91530: variable 'ansible_shell_type' from source: unknown 26264 1727204257.91536: variable 'ansible_shell_executable' from source: unknown 26264 1727204257.91542: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204257.91552: variable 'ansible_pipelining' from source: unknown 26264 1727204257.91557: variable 'ansible_timeout' from source: unknown 26264 1727204257.91566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204257.91705: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204257.91721: variable 'omit' from source: magic vars 26264 1727204257.91731: starting attempt loop 26264 1727204257.91737: running the handler 26264 1727204257.91882: variable '__network_connections_result' from source: set_fact 26264 1727204257.91901: Evaluated conditional ('stderr' in __network_connections_result): True 26264 1727204257.91910: handler run complete 26264 1727204257.91927: attempt loop complete, returning result 26264 1727204257.91935: _execute() done 26264 
1727204257.91941: dumping result to json 26264 1727204257.91948: done dumping result, returning 26264 1727204257.92579: done running TaskExecutor() for managed-node3/TASK: Assert that there is output in stderr [0affcd87-79f5-5ff5-08b0-000000000305] 26264 1727204257.92590: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000305 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 26264 1727204257.92741: no more pending results, returning what we have 26264 1727204257.92745: results queue empty 26264 1727204257.92745: checking for any_errors_fatal 26264 1727204257.92753: done checking for any_errors_fatal 26264 1727204257.92754: checking for max_fail_percentage 26264 1727204257.92756: done checking for max_fail_percentage 26264 1727204257.92757: checking to see if all hosts have failed and the running result is not ok 26264 1727204257.92758: done checking to see if all hosts have failed 26264 1727204257.92759: getting the remaining hosts for this loop 26264 1727204257.92761: done getting the remaining hosts for this loop 26264 1727204257.92766: getting the next task for host managed-node3 26264 1727204257.92775: done getting next task for host managed-node3 26264 1727204257.92777: ^ task is: TASK: meta (flush_handlers) 26264 1727204257.92778: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204257.92782: getting variables 26264 1727204257.92784: in VariableManager get_vars() 26264 1727204257.92821: Calling all_inventory to load vars for managed-node3 26264 1727204257.92824: Calling groups_inventory to load vars for managed-node3 26264 1727204257.92827: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204257.92839: Calling all_plugins_play to load vars for managed-node3 26264 1727204257.92842: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204257.92845: Calling groups_plugins_play to load vars for managed-node3 26264 1727204257.93877: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000305 26264 1727204257.93882: WORKER PROCESS EXITING 26264 1727204258.02300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204258.04096: done with get_vars() 26264 1727204258.04130: done getting variables 26264 1727204258.04199: in VariableManager get_vars() 26264 1727204258.04214: Calling all_inventory to load vars for managed-node3 26264 1727204258.04216: Calling groups_inventory to load vars for managed-node3 26264 1727204258.04218: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204258.04224: Calling all_plugins_play to load vars for managed-node3 26264 1727204258.04226: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204258.04229: Calling groups_plugins_play to load vars for managed-node3 26264 1727204258.05833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204258.07752: done with get_vars() 26264 1727204258.07785: done queuing things up, now waiting for results queue to drain 26264 1727204258.07787: results queue empty 26264 1727204258.07788: checking for any_errors_fatal 26264 1727204258.07791: done checking for any_errors_fatal 26264 1727204258.07792: checking for max_fail_percentage 26264 
1727204258.07793: done checking for max_fail_percentage 26264 1727204258.07794: checking to see if all hosts have failed and the running result is not ok 26264 1727204258.07795: done checking to see if all hosts have failed 26264 1727204258.07796: getting the remaining hosts for this loop 26264 1727204258.07801: done getting the remaining hosts for this loop 26264 1727204258.07804: getting the next task for host managed-node3 26264 1727204258.07808: done getting next task for host managed-node3 26264 1727204258.07809: ^ task is: TASK: meta (flush_handlers) 26264 1727204258.07811: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204258.07813: getting variables 26264 1727204258.07814: in VariableManager get_vars() 26264 1727204258.07826: Calling all_inventory to load vars for managed-node3 26264 1727204258.07828: Calling groups_inventory to load vars for managed-node3 26264 1727204258.07830: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204258.07836: Calling all_plugins_play to load vars for managed-node3 26264 1727204258.07838: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204258.07841: Calling groups_plugins_play to load vars for managed-node3 26264 1727204258.09307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204258.11205: done with get_vars() 26264 1727204258.11233: done getting variables 26264 1727204258.11296: in VariableManager get_vars() 26264 1727204258.11311: Calling all_inventory to load vars for managed-node3 26264 1727204258.11313: Calling groups_inventory to load vars for managed-node3 26264 1727204258.11315: Calling all_plugins_inventory to load vars for managed-node3 26264 
1727204258.11320: Calling all_plugins_play to load vars for managed-node3 26264 1727204258.11322: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204258.11325: Calling groups_plugins_play to load vars for managed-node3 26264 1727204258.12785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204258.14636: done with get_vars() 26264 1727204258.14670: done queuing things up, now waiting for results queue to drain 26264 1727204258.14673: results queue empty 26264 1727204258.14674: checking for any_errors_fatal 26264 1727204258.14675: done checking for any_errors_fatal 26264 1727204258.14676: checking for max_fail_percentage 26264 1727204258.14677: done checking for max_fail_percentage 26264 1727204258.14677: checking to see if all hosts have failed and the running result is not ok 26264 1727204258.14678: done checking to see if all hosts have failed 26264 1727204258.14679: getting the remaining hosts for this loop 26264 1727204258.14680: done getting the remaining hosts for this loop 26264 1727204258.14683: getting the next task for host managed-node3 26264 1727204258.14686: done getting next task for host managed-node3 26264 1727204258.14687: ^ task is: None 26264 1727204258.14689: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204258.14690: done queuing things up, now waiting for results queue to drain 26264 1727204258.14691: results queue empty 26264 1727204258.14691: checking for any_errors_fatal 26264 1727204258.14692: done checking for any_errors_fatal 26264 1727204258.14693: checking for max_fail_percentage 26264 1727204258.14694: done checking for max_fail_percentage 26264 1727204258.14694: checking to see if all hosts have failed and the running result is not ok 26264 1727204258.14695: done checking to see if all hosts have failed 26264 1727204258.14696: getting the next task for host managed-node3 26264 1727204258.14698: done getting next task for host managed-node3 26264 1727204258.14699: ^ task is: None 26264 1727204258.14700: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204258.14778: in VariableManager get_vars() 26264 1727204258.14794: done with get_vars() 26264 1727204258.14799: in VariableManager get_vars() 26264 1727204258.14808: done with get_vars() 26264 1727204258.14811: variable 'omit' from source: magic vars 26264 1727204258.14839: in VariableManager get_vars() 26264 1727204258.14850: done with get_vars() 26264 1727204258.14872: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 26264 1727204258.15038: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 26264 1727204258.15671: getting the remaining hosts for this loop 26264 1727204258.15673: done getting the remaining hosts for this loop 26264 1727204258.15676: getting the next task for host managed-node3 26264 1727204258.15679: done getting next task for host managed-node3 26264 1727204258.15681: ^ task is: TASK: Gathering Facts 26264 1727204258.15682: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204258.15684: getting variables 26264 1727204258.15685: in VariableManager get_vars() 26264 1727204258.15694: Calling all_inventory to load vars for managed-node3 26264 1727204258.15696: Calling groups_inventory to load vars for managed-node3 26264 1727204258.15699: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204258.15704: Calling all_plugins_play to load vars for managed-node3 26264 1727204258.15707: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204258.15710: Calling groups_plugins_play to load vars for managed-node3 26264 1727204258.17180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204258.19168: done with get_vars() 26264 1727204258.19193: done getting variables 26264 1727204258.19232: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Tuesday 24 September 2024 14:57:38 -0400 (0:00:00.316) 0:00:22.041 ***** 26264 1727204258.19256: entering _queue_task() for managed-node3/gather_facts 26264 1727204258.19679: worker is 1 (out of 1 available) 26264 1727204258.19691: exiting _queue_task() for managed-node3/gather_facts 26264 1727204258.19707: done queuing things up, now waiting for results queue to drain 26264 1727204258.19709: waiting for pending results... 
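The Gathering Facts task that follows opens with Ansible's standard remote bootstrap, visible in the `_low_level_execute_command()` records below: first `/bin/sh -c 'echo ~ && sleep 0'` discovers the remote home directory, then a `umask 77 && mkdir ...` pipeline creates a private per-task temporary directory under `~/.ansible/tmp`. A minimal local sketch of that pattern (illustrative only — the real commands are built by Ansible's shell plugin, and the trailing `sleep 0` is a long-standing workaround for shell/connection timing quirks):

```shell
#!/bin/sh
# Sketch of Ansible's remote bootstrap pattern (illustrative, not the real module code).

# Step 1: discover the remote user's home directory.
# Ansible appends 'sleep 0' to the wrapped command; it is harmless here.
home=$(/bin/sh -c 'echo ~ && sleep 0')
echo "home=$home"

# Step 2: create a unique per-task temp directory, private (0700) via umask 77.
# Ansible's real directory names embed a timestamp and counters, e.g.
# ansible-tmp-1727204258.2611668-28259-49656812903438.
stamp="ansible-tmp-$(date +%s)-$$"
tmpdir=$(/bin/sh -c "( umask 77 && mkdir -p \"$home/.ansible/tmp\" \
  && mkdir \"$home/.ansible/tmp/$stamp\" \
  && echo \"$home/.ansible/tmp/$stamp\" )")
echo "tmpdir=$tmpdir"
```

Once the directory exists, the module payload (`AnsiballZ_setup.py` for fact gathering) is transferred into it over SFTP, as the later `sftp> put` record shows.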
26264 1727204258.20133: running TaskExecutor() for managed-node3/TASK: Gathering Facts 26264 1727204258.20263: in run() - task 0affcd87-79f5-5ff5-08b0-000000000316 26264 1727204258.20287: variable 'ansible_search_path' from source: unknown 26264 1727204258.20334: calling self._execute() 26264 1727204258.20442: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204258.20456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204258.20475: variable 'omit' from source: magic vars 26264 1727204258.20929: variable 'ansible_distribution_major_version' from source: facts 26264 1727204258.20988: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204258.21083: variable 'omit' from source: magic vars 26264 1727204258.21114: variable 'omit' from source: magic vars 26264 1727204258.21156: variable 'omit' from source: magic vars 26264 1727204258.21285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204258.21330: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204258.21359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204258.21386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204258.21409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204258.21443: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204258.21455: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204258.21467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204258.21581: Set connection var ansible_pipelining to False 26264 1727204258.21589: Set 
connection var ansible_connection to ssh 26264 1727204258.21596: Set connection var ansible_shell_type to sh 26264 1727204258.21608: Set connection var ansible_shell_executable to /bin/sh 26264 1727204258.21624: Set connection var ansible_timeout to 10 26264 1727204258.21639: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204258.21676: variable 'ansible_shell_executable' from source: unknown 26264 1727204258.21684: variable 'ansible_connection' from source: unknown 26264 1727204258.21691: variable 'ansible_module_compression' from source: unknown 26264 1727204258.21697: variable 'ansible_shell_type' from source: unknown 26264 1727204258.21703: variable 'ansible_shell_executable' from source: unknown 26264 1727204258.21709: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204258.21716: variable 'ansible_pipelining' from source: unknown 26264 1727204258.21722: variable 'ansible_timeout' from source: unknown 26264 1727204258.21734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204258.21923: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204258.21939: variable 'omit' from source: magic vars 26264 1727204258.21956: starting attempt loop 26264 1727204258.21965: running the handler 26264 1727204258.21986: variable 'ansible_facts' from source: unknown 26264 1727204258.22010: _low_level_execute_command(): starting 26264 1727204258.22023: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204258.23598: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
26264 1727204258.23795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204258.23840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204258.23856: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204258.23874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204258.23892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204258.23906: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204258.23920: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204258.23933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204258.23947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204258.23968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204258.23981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204258.23991: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204258.24004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204258.24090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204258.24106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204258.24120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204258.24253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204258.25874: stdout chunk (state=3): >>>/root <<< 26264 1727204258.26087: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 26264 1727204258.26090: stdout chunk (state=3): >>><<< 26264 1727204258.26093: stderr chunk (state=3): >>><<< 26264 1727204258.26216: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204258.26220: _low_level_execute_command(): starting 26264 1727204258.26223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438 `" && echo ansible-tmp-1727204258.2611668-28259-49656812903438="` echo /root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438 `" ) && sleep 0' 26264 1727204258.27209: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204258.27241: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 26264 1727204258.27258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204258.27285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204258.27330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204258.27343: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204258.27356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204258.27380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204258.27393: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204258.27405: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204258.27418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204258.27433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204258.27448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204258.27460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204258.27477: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204258.27490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204258.27572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204258.27599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204258.27615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204258.27696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 26264 1727204258.29511: stdout chunk (state=3): >>>ansible-tmp-1727204258.2611668-28259-49656812903438=/root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438 <<< 26264 1727204258.29685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204258.29728: stderr chunk (state=3): >>><<< 26264 1727204258.29732: stdout chunk (state=3): >>><<< 26264 1727204258.29972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204258.2611668-28259-49656812903438=/root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204258.29976: variable 'ansible_module_compression' from source: unknown 26264 1727204258.29978: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204258.29980: 
variable 'ansible_facts' from source: unknown 26264 1727204258.30063: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438/AnsiballZ_setup.py 26264 1727204258.30721: Sending initial data 26264 1727204258.30725: Sent initial data (153 bytes) 26264 1727204258.33946: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204258.34015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204258.34032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204258.34049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204258.34150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204258.34166: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204258.34184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204258.34205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204258.34223: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204258.34236: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204258.34334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204258.34350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204258.34369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204258.34383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204258.34396: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204258.34410: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204258.34482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204258.34499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204258.34511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204258.34773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204258.36406: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204258.36441: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204258.36483: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpxbedugfp /root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438/AnsiballZ_setup.py <<< 26264 1727204258.36525: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204258.39629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204258.39824: stderr chunk (state=3): >>><<< 26264 1727204258.39829: stdout chunk (state=3): >>><<< 26264 1727204258.39831: done transferring module to remote 26264 1727204258.39833: _low_level_execute_command(): starting 26264 1727204258.39836: _low_level_execute_command(): 
executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438/ /root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438/AnsiballZ_setup.py && sleep 0' 26264 1727204258.40441: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204258.40445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204258.40492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204258.40496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 26264 1727204258.40498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204258.40500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204258.40502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204258.40554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204258.40568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204258.40635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204258.42336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204258.42423: stderr chunk (state=3): >>><<< 
26264 1727204258.42427: stdout chunk (state=3): >>><<< 26264 1727204258.42525: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204258.42528: _low_level_execute_command(): starting 26264 1727204258.42532: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438/AnsiballZ_setup.py && sleep 0' 26264 1727204258.43618: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204258.43636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204258.43653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204258.43674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204258.43716: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204258.43728: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204258.43742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204258.43767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204258.43781: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204258.43793: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204258.43806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204258.43821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204258.43838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204258.43850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204258.43869: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204258.43885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204258.43958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204258.43981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204258.43995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204258.44082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204258.96356: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", 
"ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3274, "used": 258}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_<<< 26264 
1727204258.96443: stdout chunk (state=3): >>>vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 604, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264279945216, "block_size": 4096, "block_total": 65519355, "block_available": 64521471, "block_used": 997884, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": 
"38", "epoch": "1727204258", "epoch_int": "1727204258", "date": "2024-09-24", "time": "14:57:38", "iso8601_micro": "2024-09-24T18:57:38.899670Z", "iso8601": "2024-09-24T18:57:38Z", "iso8601_basic": "20240924T145738899670", "iso8601_basic_short": "20240924T145738", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["peerlsr27", "eth0", "lo", "lsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:14:6b:cc:14:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c014:6bff:fecc:144a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "2a:10:ff:43:10:2c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::baf8:bf05:cbb3:11e6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_<<< 26264 1727204258.96452: stdout chunk (state=3): >>>list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::c014:6bff:fecc:144a", "fe80::8ff:f5ff:fed7:be93", "fe80::baf8:bf05:cbb3:11e6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93", "fe80::baf8:bf05:cbb3:11e6", "fe80::c014:6bff:fecc:144a"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.5, "5m": 0.39, "15m": 0.2}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204258.98090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204258.98222: stderr chunk (state=3): >>><<< 26264 1727204258.98226: stdout chunk (state=3): >>><<< 26264 1727204258.98377: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3274, "used": 258}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, 
"ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 604, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264279945216, "block_size": 4096, "block_total": 65519355, "block_available": 64521471, "block_used": 997884, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_selinux_python_present": true, 
"ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "38", "epoch": "1727204258", "epoch_int": "1727204258", "date": "2024-09-24", "time": "14:57:38", "iso8601_micro": "2024-09-24T18:57:38.899670Z", "iso8601": "2024-09-24T18:57:38Z", "iso8601_basic": "20240924T145738899670", "iso8601_basic_short": "20240924T145738", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["peerlsr27", "eth0", "lo", "lsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:14:6b:cc:14:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c014:6bff:fecc:144a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", 
"tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", 
"broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "2a:10:ff:43:10:2c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::baf8:bf05:cbb3:11e6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::c014:6bff:fecc:144a", "fe80::8ff:f5ff:fed7:be93", "fe80::baf8:bf05:cbb3:11e6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93", "fe80::baf8:bf05:cbb3:11e6", "fe80::c014:6bff:fecc:144a"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": 
"", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.5, "5m": 0.39, "15m": 0.2}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 
10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204258.98738: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204258.98767: _low_level_execute_command(): starting 26264 1727204258.98779: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204258.2611668-28259-49656812903438/ > /dev/null 2>&1 && sleep 0' 26264 1727204259.00614: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204259.00618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204259.00647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204259.00651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204259.00654: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204259.00729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204259.00733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204259.00797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204259.01082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204259.02894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204259.02967: stderr chunk (state=3): >>><<< 26264 1727204259.02971: stdout chunk (state=3): >>><<< 26264 1727204259.03182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 26264 1727204259.03186: handler run complete 26264 1727204259.03188: variable 'ansible_facts' from source: unknown 26264 1727204259.03300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204259.03807: variable 'ansible_facts' from source: unknown 26264 1727204259.03916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204259.04160: attempt loop complete, returning result 26264 1727204259.04173: _execute() done 26264 1727204259.04179: dumping result to json 26264 1727204259.04222: done dumping result, returning 26264 1727204259.04233: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-5ff5-08b0-000000000316] 26264 1727204259.04242: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000316 ok: [managed-node3] 26264 1727204259.05369: no more pending results, returning what we have 26264 1727204259.05373: results queue empty 26264 1727204259.05374: checking for any_errors_fatal 26264 1727204259.05375: done checking for any_errors_fatal 26264 1727204259.05376: checking for max_fail_percentage 26264 1727204259.05377: done checking for max_fail_percentage 26264 1727204259.05378: checking to see if all hosts have failed and the running result is not ok 26264 1727204259.05380: done checking to see if all hosts have failed 26264 1727204259.05381: getting the remaining hosts for this loop 26264 1727204259.05382: done getting the remaining hosts for this loop 26264 1727204259.05387: getting the next task for host managed-node3 26264 1727204259.05392: done getting next task for host managed-node3 26264 1727204259.05394: ^ task is: TASK: meta (flush_handlers) 26264 1727204259.05397: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks 
child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204259.05400: getting variables 26264 1727204259.05402: in VariableManager get_vars() 26264 1727204259.05427: Calling all_inventory to load vars for managed-node3 26264 1727204259.05430: Calling groups_inventory to load vars for managed-node3 26264 1727204259.05434: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204259.05445: Calling all_plugins_play to load vars for managed-node3 26264 1727204259.05448: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204259.05450: Calling groups_plugins_play to load vars for managed-node3 26264 1727204259.06841: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000316 26264 1727204259.06845: WORKER PROCESS EXITING 26264 1727204259.07313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204259.09509: done with get_vars() 26264 1727204259.09544: done getting variables 26264 1727204259.09619: in VariableManager get_vars() 26264 1727204259.09631: Calling all_inventory to load vars for managed-node3 26264 1727204259.09634: Calling groups_inventory to load vars for managed-node3 26264 1727204259.09636: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204259.09641: Calling all_plugins_play to load vars for managed-node3 26264 1727204259.09644: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204259.09647: Calling groups_plugins_play to load vars for managed-node3 26264 1727204259.11448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204259.13570: done with get_vars() 26264 1727204259.13609: done queuing things up, now waiting for results queue to drain 26264 1727204259.13612: results queue empty 26264 1727204259.13613: checking for any_errors_fatal 26264 
1727204259.13618: done checking for any_errors_fatal 26264 1727204259.13619: checking for max_fail_percentage 26264 1727204259.13620: done checking for max_fail_percentage 26264 1727204259.13624: checking to see if all hosts have failed and the running result is not ok 26264 1727204259.13625: done checking to see if all hosts have failed 26264 1727204259.13627: getting the remaining hosts for this loop 26264 1727204259.13628: done getting the remaining hosts for this loop 26264 1727204259.13631: getting the next task for host managed-node3 26264 1727204259.13635: done getting next task for host managed-node3 26264 1727204259.13638: ^ task is: TASK: Show network_provider 26264 1727204259.13639: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204259.13642: getting variables 26264 1727204259.13643: in VariableManager get_vars() 26264 1727204259.13653: Calling all_inventory to load vars for managed-node3 26264 1727204259.13655: Calling groups_inventory to load vars for managed-node3 26264 1727204259.13658: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204259.13667: Calling all_plugins_play to load vars for managed-node3 26264 1727204259.13670: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204259.13674: Calling groups_plugins_play to load vars for managed-node3 26264 1727204259.14945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204259.17020: done with get_vars() 26264 1727204259.17049: done getting variables 26264 1727204259.17098: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Tuesday 24 September 2024 14:57:39 -0400 (0:00:00.978) 0:00:23.020 ***** 26264 1727204259.17128: entering _queue_task() for managed-node3/debug 26264 1727204259.17538: worker is 1 (out of 1 available) 26264 1727204259.17551: exiting _queue_task() for managed-node3/debug 26264 1727204259.17566: done queuing things up, now waiting for results queue to drain 26264 1727204259.17568: waiting for pending results... 26264 1727204259.18979: running TaskExecutor() for managed-node3/TASK: Show network_provider 26264 1727204259.19170: in run() - task 0affcd87-79f5-5ff5-08b0-000000000033 26264 1727204259.19489: variable 'ansible_search_path' from source: unknown 26264 1727204259.19811: calling self._execute() 26264 1727204259.19916: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204259.19980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204259.20184: variable 'omit' from source: magic vars 26264 1727204259.20900: variable 'ansible_distribution_major_version' from source: facts 26264 1727204259.20987: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204259.21001: variable 'omit' from source: magic vars 26264 1727204259.21036: variable 'omit' from source: magic vars 26264 1727204259.21408: variable 'omit' from source: magic vars 26264 1727204259.21582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204259.21630: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 
1727204259.21665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204259.21701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204259.21810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204259.21852: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204259.21862: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204259.21873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204259.21968: Set connection var ansible_pipelining to False 26264 1727204259.21976: Set connection var ansible_connection to ssh 26264 1727204259.21982: Set connection var ansible_shell_type to sh 26264 1727204259.21991: Set connection var ansible_shell_executable to /bin/sh 26264 1727204259.22000: Set connection var ansible_timeout to 10 26264 1727204259.22008: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204259.22035: variable 'ansible_shell_executable' from source: unknown 26264 1727204259.22043: variable 'ansible_connection' from source: unknown 26264 1727204259.22051: variable 'ansible_module_compression' from source: unknown 26264 1727204259.22058: variable 'ansible_shell_type' from source: unknown 26264 1727204259.22063: variable 'ansible_shell_executable' from source: unknown 26264 1727204259.22074: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204259.22081: variable 'ansible_pipelining' from source: unknown 26264 1727204259.22087: variable 'ansible_timeout' from source: unknown 26264 1727204259.22095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204259.22294: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204259.22459: variable 'omit' from source: magic vars 26264 1727204259.22579: starting attempt loop 26264 1727204259.22588: running the handler 26264 1727204259.22712: variable 'network_provider' from source: set_fact 26264 1727204259.22850: variable 'network_provider' from source: set_fact 26264 1727204259.22982: handler run complete 26264 1727204259.23005: attempt loop complete, returning result 26264 1727204259.23014: _execute() done 26264 1727204259.23024: dumping result to json 26264 1727204259.23031: done dumping result, returning 26264 1727204259.23043: done running TaskExecutor() for managed-node3/TASK: Show network_provider [0affcd87-79f5-5ff5-08b0-000000000033] 26264 1727204259.23101: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000033 ok: [managed-node3] => { "network_provider": "nm" } 26264 1727204259.23338: no more pending results, returning what we have 26264 1727204259.23342: results queue empty 26264 1727204259.23343: checking for any_errors_fatal 26264 1727204259.23346: done checking for any_errors_fatal 26264 1727204259.23346: checking for max_fail_percentage 26264 1727204259.23351: done checking for max_fail_percentage 26264 1727204259.23353: checking to see if all hosts have failed and the running result is not ok 26264 1727204259.23354: done checking to see if all hosts have failed 26264 1727204259.23355: getting the remaining hosts for this loop 26264 1727204259.23357: done getting the remaining hosts for this loop 26264 1727204259.23362: getting the next task for host managed-node3 26264 1727204259.23374: done getting next task for host managed-node3 26264 1727204259.23377: ^ task is: TASK: meta (flush_handlers) 26264 1727204259.23379: ^ state is: HOST 
STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204259.23385: getting variables 26264 1727204259.23387: in VariableManager get_vars() 26264 1727204259.23419: Calling all_inventory to load vars for managed-node3 26264 1727204259.23423: Calling groups_inventory to load vars for managed-node3 26264 1727204259.23427: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204259.23441: Calling all_plugins_play to load vars for managed-node3 26264 1727204259.23444: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204259.23447: Calling groups_plugins_play to load vars for managed-node3 26264 1727204259.24609: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000033 26264 1727204259.24613: WORKER PROCESS EXITING 26264 1727204259.27043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204259.29270: done with get_vars() 26264 1727204259.29300: done getting variables 26264 1727204259.29372: in VariableManager get_vars() 26264 1727204259.29383: Calling all_inventory to load vars for managed-node3 26264 1727204259.29385: Calling groups_inventory to load vars for managed-node3 26264 1727204259.29388: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204259.29393: Calling all_plugins_play to load vars for managed-node3 26264 1727204259.29395: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204259.29398: Calling groups_plugins_play to load vars for managed-node3 26264 1727204259.32066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204259.37222: done with get_vars() 26264 1727204259.37268: done queuing things 
up, now waiting for results queue to drain 26264 1727204259.37271: results queue empty 26264 1727204259.37272: checking for any_errors_fatal 26264 1727204259.37275: done checking for any_errors_fatal 26264 1727204259.37276: checking for max_fail_percentage 26264 1727204259.37277: done checking for max_fail_percentage 26264 1727204259.37278: checking to see if all hosts have failed and the running result is not ok 26264 1727204259.37279: done checking to see if all hosts have failed 26264 1727204259.37279: getting the remaining hosts for this loop 26264 1727204259.37280: done getting the remaining hosts for this loop 26264 1727204259.37284: getting the next task for host managed-node3 26264 1727204259.37294: done getting next task for host managed-node3 26264 1727204259.37296: ^ task is: TASK: meta (flush_handlers) 26264 1727204259.37298: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204259.37301: getting variables 26264 1727204259.37302: in VariableManager get_vars() 26264 1727204259.37313: Calling all_inventory to load vars for managed-node3 26264 1727204259.37316: Calling groups_inventory to load vars for managed-node3 26264 1727204259.37318: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204259.37324: Calling all_plugins_play to load vars for managed-node3 26264 1727204259.37326: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204259.37329: Calling groups_plugins_play to load vars for managed-node3 26264 1727204259.41619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204259.47325: done with get_vars() 26264 1727204259.47362: done getting variables 26264 1727204259.47420: in VariableManager get_vars() 26264 1727204259.47432: Calling all_inventory to load vars for managed-node3 26264 1727204259.47435: Calling groups_inventory to load vars for managed-node3 26264 1727204259.47437: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204259.47443: Calling all_plugins_play to load vars for managed-node3 26264 1727204259.47445: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204259.47448: Calling groups_plugins_play to load vars for managed-node3 26264 1727204259.50713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204259.56565: done with get_vars() 26264 1727204259.56607: done queuing things up, now waiting for results queue to drain 26264 1727204259.56609: results queue empty 26264 1727204259.56610: checking for any_errors_fatal 26264 1727204259.56612: done checking for any_errors_fatal 26264 1727204259.56613: checking for max_fail_percentage 26264 1727204259.56614: done checking for max_fail_percentage 26264 1727204259.56614: checking to see if all hosts have failed and the running result is not 
ok 26264 1727204259.56615: done checking to see if all hosts have failed 26264 1727204259.56616: getting the remaining hosts for this loop 26264 1727204259.56617: done getting the remaining hosts for this loop 26264 1727204259.56620: getting the next task for host managed-node3 26264 1727204259.56624: done getting next task for host managed-node3 26264 1727204259.56624: ^ task is: None 26264 1727204259.56626: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204259.56627: done queuing things up, now waiting for results queue to drain 26264 1727204259.56628: results queue empty 26264 1727204259.56629: checking for any_errors_fatal 26264 1727204259.56630: done checking for any_errors_fatal 26264 1727204259.56630: checking for max_fail_percentage 26264 1727204259.56631: done checking for max_fail_percentage 26264 1727204259.56632: checking to see if all hosts have failed and the running result is not ok 26264 1727204259.56633: done checking to see if all hosts have failed 26264 1727204259.56634: getting the next task for host managed-node3 26264 1727204259.56636: done getting next task for host managed-node3 26264 1727204259.56637: ^ task is: None 26264 1727204259.56638: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204259.57182: in VariableManager get_vars() 26264 1727204259.57209: done with get_vars() 26264 1727204259.57216: in VariableManager get_vars() 26264 1727204259.57231: done with get_vars() 26264 1727204259.57237: variable 'omit' from source: magic vars 26264 1727204259.57367: variable 'profile' from source: play vars 26264 1727204259.57491: in VariableManager get_vars() 26264 1727204259.57507: done with get_vars() 26264 1727204259.57530: variable 'omit' from source: magic vars 26264 1727204259.57596: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 26264 1727204259.60285: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 26264 1727204259.60972: getting the remaining hosts for this loop 26264 1727204259.60974: done getting the remaining hosts for this loop 26264 1727204259.60978: getting the next task for host managed-node3 26264 1727204259.60981: done getting next task for host managed-node3 26264 1727204259.60983: ^ task is: TASK: Gathering Facts 26264 1727204259.60985: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204259.60987: getting variables 26264 1727204259.60988: in VariableManager get_vars() 26264 1727204259.61003: Calling all_inventory to load vars for managed-node3 26264 1727204259.61006: Calling groups_inventory to load vars for managed-node3 26264 1727204259.61008: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204259.61013: Calling all_plugins_play to load vars for managed-node3 26264 1727204259.61016: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204259.61019: Calling groups_plugins_play to load vars for managed-node3 26264 1727204259.64345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204259.69743: done with get_vars() 26264 1727204259.69779: done getting variables 26264 1727204259.69828: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 14:57:39 -0400 (0:00:00.527) 0:00:23.547 ***** 26264 1727204259.69855: entering _queue_task() for managed-node3/gather_facts 26264 1727204259.70178: worker is 1 (out of 1 available) 26264 1727204259.70188: exiting _queue_task() for managed-node3/gather_facts 26264 1727204259.70201: done queuing things up, now waiting for results queue to drain 26264 1727204259.70203: waiting for pending results... 
26264 1727204259.71145: running TaskExecutor() for managed-node3/TASK: Gathering Facts 26264 1727204259.71556: in run() - task 0affcd87-79f5-5ff5-08b0-00000000032b 26264 1727204259.71691: variable 'ansible_search_path' from source: unknown 26264 1727204259.71824: calling self._execute() 26264 1727204259.71953: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204259.72018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204259.72126: variable 'omit' from source: magic vars 26264 1727204259.72907: variable 'ansible_distribution_major_version' from source: facts 26264 1727204259.72925: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204259.72975: variable 'omit' from source: magic vars 26264 1727204259.73070: variable 'omit' from source: magic vars 26264 1727204259.73325: variable 'omit' from source: magic vars 26264 1727204259.73376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204259.73422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204259.73582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204259.73606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204259.73622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204259.73776: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204259.73784: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204259.73791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204259.74018: Set connection var ansible_pipelining to False 26264 1727204259.74030: Set 
connection var ansible_connection to ssh 26264 1727204259.74037: Set connection var ansible_shell_type to sh 26264 1727204259.74051: Set connection var ansible_shell_executable to /bin/sh 26264 1727204259.74067: Set connection var ansible_timeout to 10 26264 1727204259.74080: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204259.74115: variable 'ansible_shell_executable' from source: unknown 26264 1727204259.74122: variable 'ansible_connection' from source: unknown 26264 1727204259.74128: variable 'ansible_module_compression' from source: unknown 26264 1727204259.74135: variable 'ansible_shell_type' from source: unknown 26264 1727204259.74216: variable 'ansible_shell_executable' from source: unknown 26264 1727204259.74223: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204259.74231: variable 'ansible_pipelining' from source: unknown 26264 1727204259.74237: variable 'ansible_timeout' from source: unknown 26264 1727204259.74245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204259.74669: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204259.74686: variable 'omit' from source: magic vars 26264 1727204259.74696: starting attempt loop 26264 1727204259.74702: running the handler 26264 1727204259.74725: variable 'ansible_facts' from source: unknown 26264 1727204259.74759: _low_level_execute_command(): starting 26264 1727204259.74877: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204259.77225: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
26264 1727204259.77232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204259.77259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 26264 1727204259.77263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204259.77441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204259.77530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204259.77745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204259.79287: stdout chunk (state=3): >>>/root <<< 26264 1727204259.79397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204259.79468: stderr chunk (state=3): >>><<< 26264 1727204259.79503: stdout chunk (state=3): >>><<< 26264 1727204259.79579: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204259.79588: _low_level_execute_command(): starting 26264 1727204259.79616: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336 `" && echo ansible-tmp-1727204259.7952607-28410-20757572820336="` echo /root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336 `" ) && sleep 0' 26264 1727204259.81159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204259.81166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204259.81193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204259.81197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204259.81208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204259.81402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204259.81477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204259.81547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204259.83284: stdout chunk (state=3): >>>ansible-tmp-1727204259.7952607-28410-20757572820336=/root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336 <<< 26264 1727204259.83462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204259.83469: stdout chunk (state=3): >>><<< 26264 1727204259.83482: stderr chunk (state=3): >>><<< 26264 1727204259.83784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204259.7952607-28410-20757572820336=/root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204259.83788: variable 'ansible_module_compression' from source: unknown 26264 1727204259.83790: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204259.83792: variable 'ansible_facts' from source: unknown 26264 1727204259.83833: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336/AnsiballZ_setup.py 26264 1727204259.84495: Sending initial data 26264 1727204259.84498: Sent initial data (153 bytes) 26264 1727204259.86950: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204259.87084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204259.87102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204259.87120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204259.87162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204259.87179: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204259.87293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204259.87314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204259.87327: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.15.87 is address <<< 26264 1727204259.87338: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204259.87352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204259.87371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204259.87392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204259.87406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204259.87419: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204259.87433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204259.87518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204259.87535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204259.87623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204259.87839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204259.89525: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204259.89564: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle 
limit 1019; using 64 <<< 26264 1727204259.89606: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpfzfxpgui /root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336/AnsiballZ_setup.py <<< 26264 1727204259.89643: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204259.92742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204259.92879: stderr chunk (state=3): >>><<< 26264 1727204259.92883: stdout chunk (state=3): >>><<< 26264 1727204259.92885: done transferring module to remote 26264 1727204259.92888: _low_level_execute_command(): starting 26264 1727204259.92890: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336/ /root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336/AnsiballZ_setup.py && sleep 0' 26264 1727204259.94354: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204259.94410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204259.94427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204259.94446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204259.94494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204259.94624: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204259.94640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204259.94663: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204259.94679: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204259.94690: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 26264 1727204259.94703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204259.94719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204259.94743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204259.94760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204259.94776: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204259.94791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204259.94875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204259.94893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204259.94960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204259.95177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204259.96882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204259.96981: stderr chunk (state=3): >>><<< 26264 1727204259.96985: stdout chunk (state=3): >>><<< 26264 1727204259.97087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204259.97091: _low_level_execute_command(): starting 26264 1727204259.97093: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336/AnsiballZ_setup.py && sleep 0' 26264 1727204259.98643: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204259.98662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204259.98681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204259.98701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204259.98751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204259.98767: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204259.98783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204259.98802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204259.98815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204259.98832: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204259.98845: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 26264 1727204259.98881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204259.98899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204259.98912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204259.98924: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204259.98942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204259.99020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204259.99167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204259.99183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204259.99379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204260.51078: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "peerlsr27", "lsr27", "eth0"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:14:6b:cc:14:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c014:6bff:fecc:144a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off 
[fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "2a:10:ff:43:10:2c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::baf8:bf05:cbb3:11e6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "o<<< 26264 1727204260.51109: stdout chunk (state=3): >>>n [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": 
"on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::c014:6bff:fecc:144a", "fe80::8ff:f5ff:fed7:be93", "fe80::baf8:bf05:cbb3:11e6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93", "fe80::baf8:bf05:cbb3:11e6", "fe80::c014:6bff:fecc:144a"]}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.46, "5m": 0.38, "15m": 0.2}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3275, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": 
{"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 606, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264279977984, "block_size": 4096, "block_total": 65519355, "block_available": 64521479, "block_used": 997876, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "40", "epoch": "1727204260", "epoch_int": "1727204260", "date": "2024-09-24", "time": "14:57:40", "iso8601_micro": "2024-09-24T18:57:40.506207Z", "iso8601": "2024-09-24T18:57:40Z", "iso8601_basic": "20240924T145740506207", "iso8601_basic_short": "20240924T145740", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", 
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204260.53082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204260.53086: stdout chunk (state=3): >>><<< 26264 1727204260.53088: stderr chunk (state=3): >>><<< 26264 1727204260.53291: _low_level_execute_command() done: rc=0, stdout= , stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
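For context (this sketch is not part of the log): the large `ansible_facts` payload above is plain JSON that the `setup` module writes to stdout, and each fact is reachable as an ordinary nested key. A minimal Python illustration, using a tiny excerpt with values copied from the log (the real payload has many more keys):

```python
import json

# Tiny excerpt mirroring the "ansible_facts" JSON in the log above;
# values are copied from the log, but the full payload is much larger.
module_stdout = '''
{"ansible_facts": {"ansible_default_ipv4": {"gateway": "10.31.12.1",
 "interface": "eth0", "address": "10.31.15.87", "prefix": "22", "mtu": 9001},
 "ansible_processor_vcpus": 2, "ansible_memtotal_mb": 3532,
 "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd"}}
'''

facts = json.loads(module_stdout)["ansible_facts"]
print(facts["ansible_default_ipv4"]["address"])  # 10.31.15.87
print(facts["ansible_service_mgr"])              # systemd
```

Ansible parses this stdout itself and stores the result as host facts, which is why later tasks in the log can reference variables such as `ansible_distribution_major_version` without re-running setup.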
26264 1727204260.53690: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204260.53726: _low_level_execute_command(): starting 26264 1727204260.53739: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204259.7952607-28410-20757572820336/ > /dev/null 2>&1 && sleep 0' 26264 1727204260.54626: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204260.54644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204260.54665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204260.54686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204260.54739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204260.54757: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204260.54774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204260.54792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204260.54804: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.15.87 is address <<< 26264 1727204260.54814: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204260.54829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204260.54843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204260.54869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204260.54881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204260.54891: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204260.54905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204260.54997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204260.55012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204260.55026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204260.55160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204260.56999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204260.57003: stdout chunk (state=3): >>><<< 26264 1727204260.57006: stderr chunk (state=3): >>><<< 26264 1727204260.57471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204260.57475: handler run complete 26264 1727204260.57477: variable 'ansible_facts' from source: unknown 26264 1727204260.57479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204260.57706: variable 'ansible_facts' from source: unknown 26264 1727204260.57812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204260.57979: attempt loop complete, returning result 26264 1727204260.57989: _execute() done 26264 1727204260.57995: dumping result to json 26264 1727204260.58043: done dumping result, returning 26264 1727204260.58057: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-5ff5-08b0-00000000032b] 26264 1727204260.58070: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000032b ok: [managed-node3] 26264 1727204260.59187: no more pending results, returning what we have 26264 1727204260.59191: results queue empty 26264 1727204260.59192: checking for any_errors_fatal 26264 1727204260.59194: done checking for any_errors_fatal 26264 1727204260.59194: checking for max_fail_percentage 26264 1727204260.59196: done checking for max_fail_percentage 26264 1727204260.59197: checking to see if all hosts have failed and the running result is not ok 26264 1727204260.59199: done 
checking to see if all hosts have failed 26264 1727204260.59199: getting the remaining hosts for this loop 26264 1727204260.59201: done getting the remaining hosts for this loop 26264 1727204260.59206: getting the next task for host managed-node3 26264 1727204260.59212: done getting next task for host managed-node3 26264 1727204260.59214: ^ task is: TASK: meta (flush_handlers) 26264 1727204260.59216: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204260.59220: getting variables 26264 1727204260.59222: in VariableManager get_vars() 26264 1727204260.59259: Calling all_inventory to load vars for managed-node3 26264 1727204260.59262: Calling groups_inventory to load vars for managed-node3 26264 1727204260.59266: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204260.59280: Calling all_plugins_play to load vars for managed-node3 26264 1727204260.59283: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204260.59286: Calling groups_plugins_play to load vars for managed-node3 26264 1727204260.60833: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000032b 26264 1727204260.60836: WORKER PROCESS EXITING 26264 1727204260.61741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204260.63458: done with get_vars() 26264 1727204260.63485: done getting variables 26264 1727204260.63557: in VariableManager get_vars() 26264 1727204260.63572: Calling all_inventory to load vars for managed-node3 26264 1727204260.63575: Calling groups_inventory to load vars for managed-node3 26264 1727204260.63577: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204260.63582: Calling 
all_plugins_play to load vars for managed-node3 26264 1727204260.63584: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204260.63592: Calling groups_plugins_play to load vars for managed-node3 26264 1727204260.64876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204260.66737: done with get_vars() 26264 1727204260.66776: done queuing things up, now waiting for results queue to drain 26264 1727204260.66778: results queue empty 26264 1727204260.66779: checking for any_errors_fatal 26264 1727204260.66784: done checking for any_errors_fatal 26264 1727204260.66785: checking for max_fail_percentage 26264 1727204260.66786: done checking for max_fail_percentage 26264 1727204260.66786: checking to see if all hosts have failed and the running result is not ok 26264 1727204260.66787: done checking to see if all hosts have failed 26264 1727204260.66788: getting the remaining hosts for this loop 26264 1727204260.66789: done getting the remaining hosts for this loop 26264 1727204260.66792: getting the next task for host managed-node3 26264 1727204260.66797: done getting next task for host managed-node3 26264 1727204260.66800: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 26264 1727204260.66802: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204260.66812: getting variables 26264 1727204260.66813: in VariableManager get_vars() 26264 1727204260.66829: Calling all_inventory to load vars for managed-node3 26264 1727204260.66832: Calling groups_inventory to load vars for managed-node3 26264 1727204260.66834: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204260.66839: Calling all_plugins_play to load vars for managed-node3 26264 1727204260.66842: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204260.66844: Calling groups_plugins_play to load vars for managed-node3 26264 1727204260.68126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204260.69851: done with get_vars() 26264 1727204260.69880: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 14:57:40 -0400 (0:00:01.001) 0:00:24.548 *****
26264 1727204260.69977: entering _queue_task() for managed-node3/include_tasks 26264 1727204260.70314: worker is 1 (out of 1 available) 26264 1727204260.70328: exiting _queue_task() for managed-node3/include_tasks 26264 1727204260.70342: done queuing things up, now waiting for results queue to drain 26264 1727204260.70344: waiting for pending results...
26264 1727204260.70529: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 26264 1727204260.70605: in run() - task 0affcd87-79f5-5ff5-08b0-00000000003c 26264 1727204260.70618: variable 'ansible_search_path' from source: unknown 26264 1727204260.70622: variable 'ansible_search_path' from source: unknown 26264 1727204260.70653: calling self._execute() 26264 1727204260.70722: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204260.70726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204260.70735: variable 'omit' from source: magic vars 26264 1727204260.71022: variable 'ansible_distribution_major_version' from source: facts 26264 1727204260.71033: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204260.71039: _execute() done 26264 1727204260.71042: dumping result to json 26264 1727204260.71046: done dumping result, returning 26264 1727204260.71053: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-5ff5-08b0-00000000003c] 26264 1727204260.71059: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000003c 26264 1727204260.71143: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000003c 26264 1727204260.71147: WORKER PROCESS EXITING 26264 1727204260.71194: no more pending results, returning what we have 26264 1727204260.71199: in VariableManager get_vars() 26264 1727204260.71241: Calling all_inventory to load vars for managed-node3 26264 1727204260.71244: Calling groups_inventory to load vars for managed-node3 26264 1727204260.71246: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204260.71269: Calling all_plugins_play to load vars for managed-node3 26264 1727204260.71273: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204260.71277: Calling 
groups_plugins_play to load vars for managed-node3 26264 1727204260.72406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204260.73608: done with get_vars() 26264 1727204260.73623: variable 'ansible_search_path' from source: unknown 26264 1727204260.73624: variable 'ansible_search_path' from source: unknown 26264 1727204260.73651: we have included files to process 26264 1727204260.73653: generating all_blocks data 26264 1727204260.73654: done generating all_blocks data 26264 1727204260.73654: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26264 1727204260.73655: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26264 1727204260.73657: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26264 1727204260.74053: done processing included file 26264 1727204260.74054: iterating over new_blocks loaded from include file 26264 1727204260.74055: in VariableManager get_vars() 26264 1727204260.74071: done with get_vars() 26264 1727204260.74072: filtering new block on tags 26264 1727204260.74082: done filtering new block on tags 26264 1727204260.74084: in VariableManager get_vars() 26264 1727204260.74097: done with get_vars() 26264 1727204260.74099: filtering new block on tags 26264 1727204260.74111: done filtering new block on tags 26264 1727204260.74112: in VariableManager get_vars() 26264 1727204260.74124: done with get_vars() 26264 1727204260.74125: filtering new block on tags 26264 1727204260.74134: done filtering new block on tags 26264 1727204260.74135: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 26264 1727204260.74139: extending task lists for 
all hosts with included blocks 26264 1727204260.74358: done extending task lists 26264 1727204260.74359: done processing included files 26264 1727204260.74360: results queue empty 26264 1727204260.74361: checking for any_errors_fatal 26264 1727204260.74362: done checking for any_errors_fatal 26264 1727204260.74362: checking for max_fail_percentage 26264 1727204260.74363: done checking for max_fail_percentage 26264 1727204260.74365: checking to see if all hosts have failed and the running result is not ok 26264 1727204260.74366: done checking to see if all hosts have failed 26264 1727204260.74366: getting the remaining hosts for this loop 26264 1727204260.74367: done getting the remaining hosts for this loop 26264 1727204260.74369: getting the next task for host managed-node3 26264 1727204260.74371: done getting next task for host managed-node3 26264 1727204260.74373: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 26264 1727204260.74375: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204260.74382: getting variables 26264 1727204260.74382: in VariableManager get_vars() 26264 1727204260.74391: Calling all_inventory to load vars for managed-node3 26264 1727204260.74393: Calling groups_inventory to load vars for managed-node3 26264 1727204260.74394: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204260.74398: Calling all_plugins_play to load vars for managed-node3 26264 1727204260.74399: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204260.74401: Calling groups_plugins_play to load vars for managed-node3 26264 1727204260.75270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204260.76969: done with get_vars() 26264 1727204260.76996: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.071) 0:00:24.619 *****
26264 1727204260.77082: entering _queue_task() for managed-node3/setup 26264 1727204260.77429: worker is 1 (out of 1 available) 26264 1727204260.77443: exiting _queue_task() for managed-node3/setup 26264 1727204260.77457: done queuing things up, now waiting for results queue to drain 26264 1727204260.77459: waiting for pending results...
26264 1727204260.77745: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 26264 1727204260.77903: in run() - task 0affcd87-79f5-5ff5-08b0-00000000036c 26264 1727204260.77925: variable 'ansible_search_path' from source: unknown 26264 1727204260.77932: variable 'ansible_search_path' from source: unknown 26264 1727204260.77976: calling self._execute() 26264 1727204260.78088: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204260.78102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204260.78120: variable 'omit' from source: magic vars 26264 1727204260.78504: variable 'ansible_distribution_major_version' from source: facts 26264 1727204260.78523: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204260.78757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204260.86653: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204260.86727: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204260.86772: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204260.86813: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204260.86845: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204260.86937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204260.86976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204260.87009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204260.87062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204260.87086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204260.87147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204260.87182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204260.87213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204260.87268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204260.87289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204260.87452: variable '__network_required_facts' from source: role 
'' defaults 26264 1727204260.87472: variable 'ansible_facts' from source: unknown 26264 1727204260.88156: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 26264 1727204260.88167: when evaluation is False, skipping this task 26264 1727204260.88175: _execute() done 26264 1727204260.88181: dumping result to json 26264 1727204260.88188: done dumping result, returning 26264 1727204260.88199: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-5ff5-08b0-00000000036c] 26264 1727204260.88208: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000036c 26264 1727204260.88311: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000036c 26264 1727204260.88320: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204260.88363: no more pending results, returning what we have 26264 1727204260.88368: results queue empty 26264 1727204260.88369: checking for any_errors_fatal 26264 1727204260.88371: done checking for any_errors_fatal 26264 1727204260.88371: checking for max_fail_percentage 26264 1727204260.88373: done checking for max_fail_percentage 26264 1727204260.88373: checking to see if all hosts have failed and the running result is not ok 26264 1727204260.88374: done checking to see if all hosts have failed 26264 1727204260.88375: getting the remaining hosts for this loop 26264 1727204260.88377: done getting the remaining hosts for this loop 26264 1727204260.88381: getting the next task for host managed-node3 26264 1727204260.88388: done getting next task for host managed-node3 26264 1727204260.88392: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 26264 1727204260.88395: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204260.88408: getting variables 26264 1727204260.88410: in VariableManager get_vars() 26264 1727204260.88452: Calling all_inventory to load vars for managed-node3 26264 1727204260.88456: Calling groups_inventory to load vars for managed-node3 26264 1727204260.88458: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204260.88471: Calling all_plugins_play to load vars for managed-node3 26264 1727204260.88474: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204260.88477: Calling groups_plugins_play to load vars for managed-node3 26264 1727204260.94372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204260.95482: done with get_vars() 26264 1727204260.95508: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.185) 0:00:24.805 *****
26264 1727204260.95602: entering _queue_task() for managed-node3/stat 26264 1727204260.95933: worker is 1 (out of 1 available) 26264 1727204260.95944: exiting _queue_task() for managed-node3/stat 26264 1727204260.95959: done queuing things up, now waiting for results queue to drain 26264 1727204260.95961: waiting for pending results...
26264 1727204260.96277: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 26264 1727204260.96434: in run() - task 0affcd87-79f5-5ff5-08b0-00000000036e 26264 1727204260.96438: variable 'ansible_search_path' from source: unknown 26264 1727204260.96441: variable 'ansible_search_path' from source: unknown 26264 1727204260.96444: calling self._execute() 26264 1727204260.96504: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204260.96508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204260.96516: variable 'omit' from source: magic vars 26264 1727204260.96801: variable 'ansible_distribution_major_version' from source: facts 26264 1727204260.96812: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204260.96933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204260.97135: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204260.97173: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204260.97218: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204260.97246: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204260.97315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204260.97333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204260.97351: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204260.97372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204260.97437: variable '__network_is_ostree' from source: set_fact 26264 1727204260.97443: Evaluated conditional (not __network_is_ostree is defined): False 26264 1727204260.97446: when evaluation is False, skipping this task 26264 1727204260.97448: _execute() done 26264 1727204260.97454: dumping result to json 26264 1727204260.97457: done dumping result, returning 26264 1727204260.97466: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-5ff5-08b0-00000000036e] 26264 1727204260.97471: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000036e 26264 1727204260.97557: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000036e 26264 1727204260.97560: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 26264 1727204260.97614: no more pending results, returning what we have 26264 1727204260.97618: results queue empty 26264 1727204260.97618: checking for any_errors_fatal 26264 1727204260.97625: done checking for any_errors_fatal 26264 1727204260.97626: checking for max_fail_percentage 26264 1727204260.97629: done checking for max_fail_percentage 26264 1727204260.97630: checking to see if all hosts have failed and the running result is not ok 26264 1727204260.97631: done checking to see if all hosts have failed 26264 1727204260.97631: getting the remaining hosts for this loop 26264 1727204260.97633: done getting the remaining hosts for this loop 26264 
1727204260.97637: getting the next task for host managed-node3 26264 1727204260.97643: done getting next task for host managed-node3 26264 1727204260.97647: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 26264 1727204260.97650: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204260.97665: getting variables 26264 1727204260.97667: in VariableManager get_vars() 26264 1727204260.97704: Calling all_inventory to load vars for managed-node3 26264 1727204260.97707: Calling groups_inventory to load vars for managed-node3 26264 1727204260.97709: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204260.97717: Calling all_plugins_play to load vars for managed-node3 26264 1727204260.97720: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204260.97722: Calling groups_plugins_play to load vars for managed-node3 26264 1727204260.98538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204260.99590: done with get_vars() 26264 1727204260.99605: done getting variables 26264 1727204260.99652: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.040) 0:00:24.845 ***** 26264 1727204260.99681: entering _queue_task() for managed-node3/set_fact 26264 1727204260.99919: worker is 1 (out of 1 available) 26264 1727204260.99932: exiting _queue_task() for managed-node3/set_fact 26264 1727204260.99945: done queuing things up, now waiting for results queue to drain 26264 1727204260.99947: waiting for pending results... 26264 1727204261.00133: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 26264 1727204261.00238: in run() - task 0affcd87-79f5-5ff5-08b0-00000000036f 26264 1727204261.00251: variable 'ansible_search_path' from source: unknown 26264 1727204261.00256: variable 'ansible_search_path' from source: unknown 26264 1727204261.00285: calling self._execute() 26264 1727204261.00355: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204261.00359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204261.00367: variable 'omit' from source: magic vars 26264 1727204261.00796: variable 'ansible_distribution_major_version' from source: facts 26264 1727204261.00817: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204261.01018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204261.01324: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204261.01380: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204261.01436: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 
1727204261.01533: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204261.01637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204261.01673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204261.01702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204261.01746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204261.01861: variable '__network_is_ostree' from source: set_fact 26264 1727204261.01877: Evaluated conditional (not __network_is_ostree is defined): False 26264 1727204261.01887: when evaluation is False, skipping this task 26264 1727204261.01894: _execute() done 26264 1727204261.01903: dumping result to json 26264 1727204261.01911: done dumping result, returning 26264 1727204261.01925: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-5ff5-08b0-00000000036f] 26264 1727204261.01936: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000036f 26264 1727204261.02060: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000036f 26264 1727204261.02063: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 26264 1727204261.02119: no more pending results, returning what we 
have 26264 1727204261.02123: results queue empty 26264 1727204261.02124: checking for any_errors_fatal 26264 1727204261.02131: done checking for any_errors_fatal 26264 1727204261.02132: checking for max_fail_percentage 26264 1727204261.02133: done checking for max_fail_percentage 26264 1727204261.02134: checking to see if all hosts have failed and the running result is not ok 26264 1727204261.02135: done checking to see if all hosts have failed 26264 1727204261.02136: getting the remaining hosts for this loop 26264 1727204261.02138: done getting the remaining hosts for this loop 26264 1727204261.02142: getting the next task for host managed-node3 26264 1727204261.02153: done getting next task for host managed-node3 26264 1727204261.02157: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 26264 1727204261.02160: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204261.02199: getting variables 26264 1727204261.02201: in VariableManager get_vars() 26264 1727204261.02258: Calling all_inventory to load vars for managed-node3 26264 1727204261.02262: Calling groups_inventory to load vars for managed-node3 26264 1727204261.02265: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204261.02275: Calling all_plugins_play to load vars for managed-node3 26264 1727204261.02277: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204261.02279: Calling groups_plugins_play to load vars for managed-node3 26264 1727204261.03133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204261.04191: done with get_vars() 26264 1727204261.04213: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.046) 0:00:24.892 ***** 26264 1727204261.04317: entering _queue_task() for managed-node3/service_facts 26264 1727204261.04637: worker is 1 (out of 1 available) 26264 1727204261.04657: exiting _queue_task() for managed-node3/service_facts 26264 1727204261.04672: done queuing things up, now waiting for results queue to drain 26264 1727204261.04674: waiting for pending results... 
26264 1727204261.04953: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 26264 1727204261.05089: in run() - task 0affcd87-79f5-5ff5-08b0-000000000371 26264 1727204261.05115: variable 'ansible_search_path' from source: unknown 26264 1727204261.05118: variable 'ansible_search_path' from source: unknown 26264 1727204261.05144: calling self._execute() 26264 1727204261.05235: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204261.05246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204261.05261: variable 'omit' from source: magic vars 26264 1727204261.05637: variable 'ansible_distribution_major_version' from source: facts 26264 1727204261.05661: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204261.05678: variable 'omit' from source: magic vars 26264 1727204261.05739: variable 'omit' from source: magic vars 26264 1727204261.05782: variable 'omit' from source: magic vars 26264 1727204261.05828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204261.05875: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204261.05899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204261.05921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204261.05937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204261.05974: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204261.05982: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204261.05989: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 26264 1727204261.06093: Set connection var ansible_pipelining to False 26264 1727204261.06103: Set connection var ansible_connection to ssh 26264 1727204261.06110: Set connection var ansible_shell_type to sh 26264 1727204261.06121: Set connection var ansible_shell_executable to /bin/sh 26264 1727204261.06134: Set connection var ansible_timeout to 10 26264 1727204261.06146: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204261.06183: variable 'ansible_shell_executable' from source: unknown 26264 1727204261.06192: variable 'ansible_connection' from source: unknown 26264 1727204261.06200: variable 'ansible_module_compression' from source: unknown 26264 1727204261.06207: variable 'ansible_shell_type' from source: unknown 26264 1727204261.06215: variable 'ansible_shell_executable' from source: unknown 26264 1727204261.06222: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204261.06230: variable 'ansible_pipelining' from source: unknown 26264 1727204261.06237: variable 'ansible_timeout' from source: unknown 26264 1727204261.06245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204261.06450: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204261.06466: variable 'omit' from source: magic vars 26264 1727204261.06470: starting attempt loop 26264 1727204261.06473: running the handler 26264 1727204261.06483: _low_level_execute_command(): starting 26264 1727204261.06489: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204261.07007: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 26264 1727204261.07025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204261.07041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204261.07055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204261.07068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204261.07118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204261.07133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204261.07191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204261.08834: stdout chunk (state=3): >>>/root <<< 26264 1727204261.08946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204261.08993: stderr chunk (state=3): >>><<< 26264 1727204261.08996: stdout chunk (state=3): >>><<< 26264 1727204261.09018: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204261.09028: _low_level_execute_command(): starting 26264 1727204261.09034: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216 `" && echo ansible-tmp-1727204261.0901651-28460-188919324933216="` echo /root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216 `" ) && sleep 0' 26264 1727204261.09496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204261.09502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204261.09528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204261.09552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204261.09603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204261.09615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204261.09672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204261.11486: stdout chunk (state=3): >>>ansible-tmp-1727204261.0901651-28460-188919324933216=/root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216 <<< 26264 1727204261.11602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204261.11658: stderr chunk (state=3): >>><<< 26264 1727204261.11662: stdout chunk (state=3): >>><<< 26264 1727204261.11679: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204261.0901651-28460-188919324933216=/root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204261.11722: variable 'ansible_module_compression' from source: unknown 26264 1727204261.11761: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 26264 1727204261.11796: variable 'ansible_facts' from source: unknown 26264 1727204261.11852: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216/AnsiballZ_service_facts.py 26264 1727204261.11958: Sending initial data 26264 1727204261.11961: Sent initial data (162 bytes) 26264 1727204261.12665: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204261.12679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204261.12704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204261.12722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204261.12767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204261.12780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204261.12827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204261.14502: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204261.14537: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204261.14575: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpy4b0kheo /root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216/AnsiballZ_service_facts.py <<< 26264 1727204261.14610: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204261.15421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204261.15536: stderr chunk (state=3): >>><<< 26264 1727204261.15539: stdout chunk (state=3): >>><<< 26264 1727204261.15558: done transferring module to remote 26264 1727204261.15569: _low_level_execute_command(): starting 26264 
1727204261.15574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216/ /root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216/AnsiballZ_service_facts.py && sleep 0' 26264 1727204261.16060: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204261.16065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204261.16099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204261.16102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204261.16104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204261.16163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204261.16172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204261.16173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204261.16208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204261.17914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204261.17982: stderr chunk (state=3): 
>>><<< 26264 1727204261.17985: stdout chunk (state=3): >>><<< 26264 1727204261.17999: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204261.18003: _low_level_execute_command(): starting 26264 1727204261.18010: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216/AnsiballZ_service_facts.py && sleep 0' 26264 1727204261.18485: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204261.18489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204261.18526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 
10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204261.18529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204261.18532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204261.18593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204261.18596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204261.18603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204261.18647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204262.45020: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", 
"source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": 
{"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 26264 1727204262.45069: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": 
{"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "stat<<< 26264 1727204262.45075: stdout chunk (state=3): >>>e": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": 
"systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 26264 1727204262.45080: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, 
"user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.s<<< 26264 1727204262.45108: stdout chunk (state=3): >>>ervice": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": 
{"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 26264 1727204262.46389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204262.46454: stderr chunk (state=3): >>><<< 26264 1727204262.46458: stdout chunk (state=3): >>><<< 26264 1727204262.46931: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": 
"systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": 
{"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204262.47853: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204262.47894: _low_level_execute_command(): starting 26264 1727204262.47965: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204261.0901651-28460-188919324933216/ > /dev/null 2>&1 && sleep 0' 26264 1727204262.48732: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204262.48736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204262.48777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.48780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204262.48783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.48839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204262.48842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204262.48848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204262.48892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204262.50645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204262.50731: stderr chunk (state=3): >>><<< 26264 1727204262.50735: stdout chunk (state=3): >>><<< 26264 1727204262.50772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 26264 1727204262.50775: handler run complete 26264 1727204262.50977: variable 'ansible_facts' from source: unknown 26264 1727204262.51578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204262.51959: variable 'ansible_facts' from source: unknown 26264 1727204262.52075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204262.52214: attempt loop complete, returning result 26264 1727204262.52221: _execute() done 26264 1727204262.52227: dumping result to json 26264 1727204262.52263: done dumping result, returning 26264 1727204262.52273: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-5ff5-08b0-000000000371] 26264 1727204262.52278: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000371 26264 1727204262.53126: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000371 26264 1727204262.53129: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204262.53197: no more pending results, returning what we have 26264 1727204262.53200: results queue empty 26264 1727204262.53200: checking for any_errors_fatal 26264 1727204262.53202: done checking for any_errors_fatal 26264 1727204262.53203: checking for max_fail_percentage 26264 1727204262.53204: done checking for max_fail_percentage 26264 1727204262.53204: checking to see if all hosts have failed and the running result is not ok 26264 1727204262.53205: done checking to see if all hosts have failed 26264 1727204262.53206: getting the remaining hosts for this loop 26264 1727204262.53206: done getting the remaining hosts for this loop 26264 1727204262.53209: getting the next task for host managed-node3 26264 1727204262.53213: done getting next task 
for host managed-node3 26264 1727204262.53215: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 26264 1727204262.53217: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204262.53222: getting variables 26264 1727204262.53223: in VariableManager get_vars() 26264 1727204262.53251: Calling all_inventory to load vars for managed-node3 26264 1727204262.53253: Calling groups_inventory to load vars for managed-node3 26264 1727204262.53255: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204262.53262: Calling all_plugins_play to load vars for managed-node3 26264 1727204262.53271: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204262.53276: Calling groups_plugins_play to load vars for managed-node3 26264 1727204262.54290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204262.55458: done with get_vars() 26264 1727204262.55479: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:42 -0400 (0:00:01.512) 0:00:26.404 ***** 26264 1727204262.55566: entering _queue_task() for managed-node3/package_facts 26264 1727204262.55837: worker is 1 (out of 1 available) 26264 1727204262.55853: exiting 
_queue_task() for managed-node3/package_facts 26264 1727204262.55868: done queuing things up, now waiting for results queue to drain 26264 1727204262.55871: waiting for pending results... 26264 1727204262.56079: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 26264 1727204262.56212: in run() - task 0affcd87-79f5-5ff5-08b0-000000000372 26264 1727204262.56236: variable 'ansible_search_path' from source: unknown 26264 1727204262.56240: variable 'ansible_search_path' from source: unknown 26264 1727204262.56281: calling self._execute() 26264 1727204262.56351: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204262.56358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204262.56367: variable 'omit' from source: magic vars 26264 1727204262.56651: variable 'ansible_distribution_major_version' from source: facts 26264 1727204262.56665: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204262.56675: variable 'omit' from source: magic vars 26264 1727204262.56714: variable 'omit' from source: magic vars 26264 1727204262.56738: variable 'omit' from source: magic vars 26264 1727204262.56772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204262.56806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204262.56825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204262.56839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204262.56848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204262.56877: variable 'inventory_hostname' from source: host vars 
for 'managed-node3' 26264 1727204262.56881: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204262.56884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204262.56957: Set connection var ansible_pipelining to False 26264 1727204262.56961: Set connection var ansible_connection to ssh 26264 1727204262.56996: Set connection var ansible_shell_type to sh 26264 1727204262.56999: Set connection var ansible_shell_executable to /bin/sh 26264 1727204262.57001: Set connection var ansible_timeout to 10 26264 1727204262.57004: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204262.57016: variable 'ansible_shell_executable' from source: unknown 26264 1727204262.57018: variable 'ansible_connection' from source: unknown 26264 1727204262.57022: variable 'ansible_module_compression' from source: unknown 26264 1727204262.57026: variable 'ansible_shell_type' from source: unknown 26264 1727204262.57029: variable 'ansible_shell_executable' from source: unknown 26264 1727204262.57031: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204262.57034: variable 'ansible_pipelining' from source: unknown 26264 1727204262.57036: variable 'ansible_timeout' from source: unknown 26264 1727204262.57038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204262.57188: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204262.57197: variable 'omit' from source: magic vars 26264 1727204262.57202: starting attempt loop 26264 1727204262.57204: running the handler 26264 1727204262.57215: _low_level_execute_command(): starting 26264 1727204262.57223: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 
1727204262.57794: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204262.57811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204262.57826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204262.57842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.57926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204262.57942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204262.57968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204262.59528: stdout chunk (state=3): >>>/root <<< 26264 1727204262.59637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204262.59690: stderr chunk (state=3): >>><<< 26264 1727204262.59693: stdout chunk (state=3): >>><<< 26264 1727204262.59715: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204262.59727: _low_level_execute_command(): starting 26264 1727204262.59733: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288 `" && echo ansible-tmp-1727204262.5971546-28530-16140284762288="` echo /root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288 `" ) && sleep 0' 26264 1727204262.60369: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204262.60388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204262.60407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204262.60426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204262.60468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 
1727204262.60488: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204262.60504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.60508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204262.60516: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204262.60528: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204262.60541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204262.60572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204262.60576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204262.60578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.60637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204262.60655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204262.60678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204262.60755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204262.62575: stdout chunk (state=3): >>>ansible-tmp-1727204262.5971546-28530-16140284762288=/root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288 <<< 26264 1727204262.62710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204262.62799: stderr chunk (state=3): >>><<< 26264 1727204262.62802: stdout chunk (state=3): >>><<< 26264 1727204262.62840: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204262.5971546-28530-16140284762288=/root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204262.62884: variable 'ansible_module_compression' from source: unknown 26264 1727204262.62937: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 26264 1727204262.63003: variable 'ansible_facts' from source: unknown 26264 1727204262.63183: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288/AnsiballZ_package_facts.py 26264 1727204262.63353: Sending initial data 26264 1727204262.63356: Sent initial data (161 bytes) 26264 1727204262.64152: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204262.64203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.64208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204262.64213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.64277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204262.64290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204262.64326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204262.65988: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204262.66028: stderr chunk (state=3): >>>debug1: 
Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204262.66068: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp_fszeocu /root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288/AnsiballZ_package_facts.py <<< 26264 1727204262.66105: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204262.68152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204262.68576: stderr chunk (state=3): >>><<< 26264 1727204262.68584: stdout chunk (state=3): >>><<< 26264 1727204262.68589: done transferring module to remote 26264 1727204262.68591: _low_level_execute_command(): starting 26264 1727204262.68596: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288/ /root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288/AnsiballZ_package_facts.py && sleep 0' 26264 1727204262.69589: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204262.69621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204262.69652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204262.69692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204262.69753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204262.69774: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204262.69793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.69823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204262.69838: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204262.69857: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204262.69876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204262.69904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204262.69917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204262.69931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204262.69951: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204262.69961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.70071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204262.70088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204262.70093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204262.70238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204262.71862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204262.72029: stderr chunk (state=3): >>><<< 26264 1727204262.72049: stdout chunk (state=3): >>><<< 26264 1727204262.72245: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204262.72248: _low_level_execute_command(): starting 26264 1727204262.72255: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288/AnsiballZ_package_facts.py && sleep 0' 26264 1727204262.73151: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204262.73178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204262.73206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204262.73246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204262.73315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204262.73345: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204262.73387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.73420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204262.73443: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204262.73471: stderr chunk (state=3): 
>>>debug1: re-parsing configuration <<< 26264 1727204262.73494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204262.73522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204262.73539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204262.73551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204262.73561: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204262.73594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204262.73673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204262.73702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204262.73717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204262.73797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204263.19602: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", 
"release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": 
[{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version<<< 26264 1727204263.19732: stdout chunk (state=3): >>>": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": 
"0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": 
"1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": 
"8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": 
"2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": 
"libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": 
[{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version":
"2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": 
[{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": 
[{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": 
[{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": 
"perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": 
"4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 26264 1727204263.19815: stdout chunk (state=3): >>>}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", 
"version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": 
"python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, 
"arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 26264 1727204263.21284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204263.21289: stdout chunk (state=3): >>><<< 26264 1727204263.21326: stderr chunk (state=3): >>><<< 26264 1727204263.21461: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204263.26657: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204263.26696: _low_level_execute_command(): starting 26264 1727204263.26712: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204262.5971546-28530-16140284762288/ > /dev/null 2>&1 && sleep 0' 26264 1727204263.27993: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 <<< 26264 1727204263.28007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204263.28021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204263.28053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204263.28101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204263.28113: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204263.28126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204263.28189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204263.28201: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204263.28212: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204263.28223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204263.28236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204263.28256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204263.28275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204263.28287: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204263.28299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204263.28510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204263.28534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204263.28555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
26264 1727204263.28633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204263.30499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204263.30598: stderr chunk (state=3): >>><<< 26264 1727204263.30602: stdout chunk (state=3): >>><<< 26264 1727204263.30890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204263.30893: handler run complete 26264 1727204263.32192: variable 'ansible_facts' from source: unknown 26264 1727204263.33536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204263.40372: variable 'ansible_facts' from source: unknown 26264 1727204263.40969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 
1727204263.42273: attempt loop complete, returning result 26264 1727204263.42417: _execute() done 26264 1727204263.42427: dumping result to json 26264 1727204263.42917: done dumping result, returning 26264 1727204263.42976: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-5ff5-08b0-000000000372] 26264 1727204263.42987: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000372 26264 1727204263.48527: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000372 26264 1727204263.48530: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204263.48699: no more pending results, returning what we have 26264 1727204263.48703: results queue empty 26264 1727204263.48703: checking for any_errors_fatal 26264 1727204263.48709: done checking for any_errors_fatal 26264 1727204263.48710: checking for max_fail_percentage 26264 1727204263.48711: done checking for max_fail_percentage 26264 1727204263.48712: checking to see if all hosts have failed and the running result is not ok 26264 1727204263.48713: done checking to see if all hosts have failed 26264 1727204263.48714: getting the remaining hosts for this loop 26264 1727204263.48716: done getting the remaining hosts for this loop 26264 1727204263.48719: getting the next task for host managed-node3 26264 1727204263.48726: done getting next task for host managed-node3 26264 1727204263.48730: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 26264 1727204263.48732: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204263.48742: getting variables 26264 1727204263.48744: in VariableManager get_vars() 26264 1727204263.48780: Calling all_inventory to load vars for managed-node3 26264 1727204263.48783: Calling groups_inventory to load vars for managed-node3 26264 1727204263.48786: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204263.48795: Calling all_plugins_play to load vars for managed-node3 26264 1727204263.48798: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204263.48801: Calling groups_plugins_play to load vars for managed-node3 26264 1727204263.50910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204263.53717: done with get_vars() 26264 1727204263.53760: done getting variables 26264 1727204263.53824: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.982) 0:00:27.387 ***** 26264 1727204263.53866: entering _queue_task() for managed-node3/debug 26264 1727204263.54211: worker is 1 (out of 1 available) 26264 1727204263.54224: exiting _queue_task() for managed-node3/debug 26264 1727204263.54237: done queuing things up, now waiting for results queue to drain 26264 1727204263.54239: waiting for pending results... 
26264 1727204263.55136: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 26264 1727204263.55279: in run() - task 0affcd87-79f5-5ff5-08b0-00000000003d 26264 1727204263.55302: variable 'ansible_search_path' from source: unknown 26264 1727204263.55311: variable 'ansible_search_path' from source: unknown 26264 1727204263.55354: calling self._execute() 26264 1727204263.55457: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204263.55473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204263.55488: variable 'omit' from source: magic vars 26264 1727204263.55874: variable 'ansible_distribution_major_version' from source: facts 26264 1727204263.55894: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204263.55910: variable 'omit' from source: magic vars 26264 1727204263.55955: variable 'omit' from source: magic vars 26264 1727204263.56077: variable 'network_provider' from source: set_fact 26264 1727204263.56099: variable 'omit' from source: magic vars 26264 1727204263.56150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204263.56193: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204263.56218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204263.56245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204263.56267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204263.56304: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204263.56314: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 
1727204263.56324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204263.56562: Set connection var ansible_pipelining to False 26264 1727204263.56573: Set connection var ansible_connection to ssh 26264 1727204263.56673: Set connection var ansible_shell_type to sh 26264 1727204263.56684: Set connection var ansible_shell_executable to /bin/sh 26264 1727204263.56696: Set connection var ansible_timeout to 10 26264 1727204263.56707: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204263.56736: variable 'ansible_shell_executable' from source: unknown 26264 1727204263.56744: variable 'ansible_connection' from source: unknown 26264 1727204263.56753: variable 'ansible_module_compression' from source: unknown 26264 1727204263.56760: variable 'ansible_shell_type' from source: unknown 26264 1727204263.56770: variable 'ansible_shell_executable' from source: unknown 26264 1727204263.56777: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204263.56783: variable 'ansible_pipelining' from source: unknown 26264 1727204263.56789: variable 'ansible_timeout' from source: unknown 26264 1727204263.56796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204263.57119: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204263.57158: variable 'omit' from source: magic vars 26264 1727204263.57192: starting attempt loop 26264 1727204263.57257: running the handler 26264 1727204263.57306: handler run complete 26264 1727204263.57413: attempt loop complete, returning result 26264 1727204263.57424: _execute() done 26264 1727204263.57431: dumping result to json 26264 1727204263.57438: done dumping result, returning 
26264 1727204263.57452: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-5ff5-08b0-00000000003d] 26264 1727204263.57461: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000003d ok: [managed-node3] => {} MSG: Using network provider: nm 26264 1727204263.57624: no more pending results, returning what we have 26264 1727204263.57629: results queue empty 26264 1727204263.57630: checking for any_errors_fatal 26264 1727204263.57639: done checking for any_errors_fatal 26264 1727204263.57640: checking for max_fail_percentage 26264 1727204263.57642: done checking for max_fail_percentage 26264 1727204263.57643: checking to see if all hosts have failed and the running result is not ok 26264 1727204263.57645: done checking to see if all hosts have failed 26264 1727204263.57646: getting the remaining hosts for this loop 26264 1727204263.57650: done getting the remaining hosts for this loop 26264 1727204263.57654: getting the next task for host managed-node3 26264 1727204263.57662: done getting next task for host managed-node3 26264 1727204263.57668: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 26264 1727204263.57670: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204263.57680: getting variables 26264 1727204263.57682: in VariableManager get_vars() 26264 1727204263.57722: Calling all_inventory to load vars for managed-node3 26264 1727204263.57725: Calling groups_inventory to load vars for managed-node3 26264 1727204263.57728: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204263.57739: Calling all_plugins_play to load vars for managed-node3 26264 1727204263.57742: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204263.57745: Calling groups_plugins_play to load vars for managed-node3 26264 1727204263.58801: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000003d 26264 1727204263.58805: WORKER PROCESS EXITING 26264 1727204263.61016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204263.64613: done with get_vars() 26264 1727204263.64653: done getting variables 26264 1727204263.64722: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.108) 0:00:27.496 ***** 26264 1727204263.64756: entering _queue_task() for managed-node3/fail 26264 1727204263.65913: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 26264 1727204263.65932: worker is 1 (out of 1 available) 26264 
1727204263.65939: exiting _queue_task() for managed-node3/fail 26264 1727204263.65952: done queuing things up, now waiting for results queue to drain 26264 1727204263.65954: waiting for pending results... 26264 1727204263.66557: in run() - task 0affcd87-79f5-5ff5-08b0-00000000003e 26264 1727204263.66579: variable 'ansible_search_path' from source: unknown 26264 1727204263.66587: variable 'ansible_search_path' from source: unknown 26264 1727204263.66627: calling self._execute() 26264 1727204263.66746: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204263.66760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204263.66776: variable 'omit' from source: magic vars 26264 1727204263.67169: variable 'ansible_distribution_major_version' from source: facts 26264 1727204263.67191: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204263.67328: variable 'network_state' from source: role '' defaults 26264 1727204263.67344: Evaluated conditional (network_state != {}): False 26264 1727204263.67354: when evaluation is False, skipping this task 26264 1727204263.67360: _execute() done 26264 1727204263.67369: dumping result to json 26264 1727204263.67376: done dumping result, returning 26264 1727204263.67385: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-5ff5-08b0-00000000003e] 26264 1727204263.67400: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000003e skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204263.67555: no more pending results, returning what we have 26264 1727204263.67559: results queue empty 26264 1727204263.67560: checking for any_errors_fatal 26264 1727204263.67570: done checking for 
any_errors_fatal 26264 1727204263.67571: checking for max_fail_percentage 26264 1727204263.67573: done checking for max_fail_percentage 26264 1727204263.67575: checking to see if all hosts have failed and the running result is not ok 26264 1727204263.67576: done checking to see if all hosts have failed 26264 1727204263.67576: getting the remaining hosts for this loop 26264 1727204263.67578: done getting the remaining hosts for this loop 26264 1727204263.67583: getting the next task for host managed-node3 26264 1727204263.67589: done getting next task for host managed-node3 26264 1727204263.67594: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 26264 1727204263.67596: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204263.67611: getting variables 26264 1727204263.67613: in VariableManager get_vars() 26264 1727204263.67657: Calling all_inventory to load vars for managed-node3 26264 1727204263.67660: Calling groups_inventory to load vars for managed-node3 26264 1727204263.67663: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204263.67677: Calling all_plugins_play to load vars for managed-node3 26264 1727204263.67680: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204263.67683: Calling groups_plugins_play to load vars for managed-node3 26264 1727204263.68682: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000003e 26264 1727204263.68686: WORKER PROCESS EXITING 26264 1727204263.69412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204263.71335: done with get_vars() 26264 1727204263.71372: done getting variables 26264 1727204263.71432: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.067) 0:00:27.563 ***** 26264 1727204263.71473: entering _queue_task() for managed-node3/fail 26264 1727204263.71801: worker is 1 (out of 1 available) 26264 1727204263.71813: exiting _queue_task() for managed-node3/fail 26264 1727204263.71825: done queuing things up, now waiting for results queue to drain 26264 1727204263.71827: waiting for pending results... 
26264 1727204263.72793: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 26264 1727204263.73630: in run() - task 0affcd87-79f5-5ff5-08b0-00000000003f 26264 1727204263.73650: variable 'ansible_search_path' from source: unknown 26264 1727204263.73712: variable 'ansible_search_path' from source: unknown 26264 1727204263.73753: calling self._execute() 26264 1727204263.73961: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204263.73976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204263.74046: variable 'omit' from source: magic vars 26264 1727204263.74738: variable 'ansible_distribution_major_version' from source: facts 26264 1727204263.74760: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204263.74934: variable 'network_state' from source: role '' defaults 26264 1727204263.74952: Evaluated conditional (network_state != {}): False 26264 1727204263.74960: when evaluation is False, skipping this task 26264 1727204263.74969: _execute() done 26264 1727204263.74977: dumping result to json 26264 1727204263.74989: done dumping result, returning 26264 1727204263.75001: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-5ff5-08b0-00000000003f] 26264 1727204263.75012: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000003f skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204263.75196: no more pending results, returning what we have 26264 1727204263.75201: results queue empty 26264 1727204263.75202: checking for any_errors_fatal 26264 1727204263.75209: done checking for any_errors_fatal 
26264 1727204263.75210: checking for max_fail_percentage 26264 1727204263.75212: done checking for max_fail_percentage 26264 1727204263.75213: checking to see if all hosts have failed and the running result is not ok 26264 1727204263.75215: done checking to see if all hosts have failed 26264 1727204263.75215: getting the remaining hosts for this loop 26264 1727204263.75217: done getting the remaining hosts for this loop 26264 1727204263.75222: getting the next task for host managed-node3 26264 1727204263.75231: done getting next task for host managed-node3 26264 1727204263.75236: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 26264 1727204263.75238: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204263.75256: getting variables 26264 1727204263.75258: in VariableManager get_vars() 26264 1727204263.75305: Calling all_inventory to load vars for managed-node3 26264 1727204263.75308: Calling groups_inventory to load vars for managed-node3 26264 1727204263.75311: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204263.75327: Calling all_plugins_play to load vars for managed-node3 26264 1727204263.75330: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204263.75333: Calling groups_plugins_play to load vars for managed-node3 26264 1727204263.76346: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000003f 26264 1727204263.76349: WORKER PROCESS EXITING 26264 1727204263.78194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204263.80600: done with get_vars() 26264 1727204263.80634: done getting variables 26264 1727204263.80708: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.092) 0:00:27.656 ***** 26264 1727204263.80743: entering _queue_task() for managed-node3/fail 26264 1727204263.81392: worker is 1 (out of 1 available) 26264 1727204263.81406: exiting _queue_task() for managed-node3/fail 26264 1727204263.81418: done queuing things up, now waiting for results queue to drain 26264 1727204263.81420: waiting for pending results... 
26264 1727204263.81742: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 26264 1727204263.81913: in run() - task 0affcd87-79f5-5ff5-08b0-000000000040 26264 1727204263.81962: variable 'ansible_search_path' from source: unknown 26264 1727204263.81975: variable 'ansible_search_path' from source: unknown 26264 1727204263.82014: calling self._execute() 26264 1727204263.82121: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204263.82135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204263.82151: variable 'omit' from source: magic vars 26264 1727204263.82856: variable 'ansible_distribution_major_version' from source: facts 26264 1727204263.82879: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204263.83355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204263.86359: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204263.86449: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204263.86497: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204263.86539: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204263.86578: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204263.86672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204263.86724: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204263.86760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204263.86815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204263.86836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204263.86945: variable 'ansible_distribution_major_version' from source: facts 26264 1727204263.87004: Evaluated conditional (ansible_distribution_major_version | int > 9): False 26264 1727204263.87012: when evaluation is False, skipping this task 26264 1727204263.87107: _execute() done 26264 1727204263.87114: dumping result to json 26264 1727204263.87122: done dumping result, returning 26264 1727204263.87133: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-5ff5-08b0-000000000040] 26264 1727204263.87142: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000040 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 26264 1727204263.87296: no more pending results, returning what we have 26264 1727204263.87300: results queue empty 26264 1727204263.87301: checking for any_errors_fatal 26264 1727204263.87307: done checking for any_errors_fatal 26264 
1727204263.87308: checking for max_fail_percentage 26264 1727204263.87310: done checking for max_fail_percentage 26264 1727204263.87310: checking to see if all hosts have failed and the running result is not ok 26264 1727204263.87312: done checking to see if all hosts have failed 26264 1727204263.87313: getting the remaining hosts for this loop 26264 1727204263.87314: done getting the remaining hosts for this loop 26264 1727204263.87319: getting the next task for host managed-node3 26264 1727204263.87326: done getting next task for host managed-node3 26264 1727204263.87332: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 26264 1727204263.87333: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204263.87346: getting variables 26264 1727204263.87347: in VariableManager get_vars() 26264 1727204263.87389: Calling all_inventory to load vars for managed-node3 26264 1727204263.87392: Calling groups_inventory to load vars for managed-node3 26264 1727204263.87394: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204263.87405: Calling all_plugins_play to load vars for managed-node3 26264 1727204263.87407: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204263.87410: Calling groups_plugins_play to load vars for managed-node3 26264 1727204263.88413: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000040 26264 1727204263.88417: WORKER PROCESS EXITING 26264 1727204263.89624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204263.92588: done with get_vars() 26264 1727204263.92631: done getting variables 26264 1727204263.93080: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.123) 0:00:27.780 ***** 26264 1727204263.93116: entering _queue_task() for managed-node3/dnf 26264 1727204263.93614: worker is 1 (out of 1 available) 26264 1727204263.93628: exiting _queue_task() for managed-node3/dnf 26264 1727204263.93640: done queuing things up, now waiting for results queue to drain 26264 1727204263.93641: waiting for pending results... 
26264 1727204263.94654: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 26264 1727204263.95354: in run() - task 0affcd87-79f5-5ff5-08b0-000000000041 26264 1727204263.95358: variable 'ansible_search_path' from source: unknown 26264 1727204263.95361: variable 'ansible_search_path' from source: unknown 26264 1727204263.95366: calling self._execute() 26264 1727204263.95698: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204263.95702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204263.95705: variable 'omit' from source: magic vars 26264 1727204263.95890: variable 'ansible_distribution_major_version' from source: facts 26264 1727204263.95902: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204263.96127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204264.00810: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204264.00918: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204264.01171: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204264.01209: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204264.01243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204264.01321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.01357: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.01394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.01438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.01460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.01585: variable 'ansible_distribution' from source: facts 26264 1727204264.01596: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.01634: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 26264 1727204264.01769: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204264.01913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.01943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.01973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.02018: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.02038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.02088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.02121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.02148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.02190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.02207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.02254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.02284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 
1727204264.02314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.02362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.02384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.02575: variable 'network_connections' from source: play vars 26264 1727204264.02591: variable 'profile' from source: play vars 26264 1727204264.02683: variable 'profile' from source: play vars 26264 1727204264.02693: variable 'interface' from source: set_fact 26264 1727204264.02756: variable 'interface' from source: set_fact 26264 1727204264.02868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204264.03118: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204264.03159: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204264.03208: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204264.03243: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204264.03292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204264.03327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204264.03368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.03401: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204264.03454: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204264.03709: variable 'network_connections' from source: play vars 26264 1727204264.03721: variable 'profile' from source: play vars 26264 1727204264.03790: variable 'profile' from source: play vars 26264 1727204264.03799: variable 'interface' from source: set_fact 26264 1727204264.03865: variable 'interface' from source: set_fact 26264 1727204264.03897: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26264 1727204264.03905: when evaluation is False, skipping this task 26264 1727204264.03911: _execute() done 26264 1727204264.03917: dumping result to json 26264 1727204264.03924: done dumping result, returning 26264 1727204264.03934: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-000000000041] 26264 1727204264.03943: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000041 26264 1727204264.04062: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000041 26264 1727204264.04071: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 26264 1727204264.04311: no more pending results, returning what we have 26264 1727204264.04314: results queue empty 26264 1727204264.04315: checking for any_errors_fatal 26264 1727204264.04321: done checking for any_errors_fatal 26264 1727204264.04321: checking for max_fail_percentage 26264 1727204264.04323: done checking for max_fail_percentage 26264 1727204264.04324: checking to see if all hosts have failed and the running result is not ok 26264 1727204264.04325: done checking to see if all hosts have failed 26264 1727204264.04325: getting the remaining hosts for this loop 26264 1727204264.04326: done getting the remaining hosts for this loop 26264 1727204264.04330: getting the next task for host managed-node3 26264 1727204264.04335: done getting next task for host managed-node3 26264 1727204264.04339: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 26264 1727204264.04341: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204264.04356: getting variables 26264 1727204264.04357: in VariableManager get_vars() 26264 1727204264.04392: Calling all_inventory to load vars for managed-node3 26264 1727204264.04395: Calling groups_inventory to load vars for managed-node3 26264 1727204264.04397: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204264.04406: Calling all_plugins_play to load vars for managed-node3 26264 1727204264.04408: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204264.04411: Calling groups_plugins_play to load vars for managed-node3 26264 1727204264.06481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204264.09212: done with get_vars() 26264 1727204264.09246: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 26264 1727204264.09327: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.162) 0:00:27.942 ***** 26264 1727204264.09358: entering _queue_task() for managed-node3/yum 26264 1727204264.09710: worker is 1 (out of 1 available) 26264 1727204264.09724: exiting _queue_task() for managed-node3/yum 26264 1727204264.09737: done queuing things up, now waiting for results queue to drain 26264 1727204264.09739: waiting for pending results... 
26264 1727204264.10030: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 26264 1727204264.10158: in run() - task 0affcd87-79f5-5ff5-08b0-000000000042 26264 1727204264.10186: variable 'ansible_search_path' from source: unknown 26264 1727204264.10194: variable 'ansible_search_path' from source: unknown 26264 1727204264.10235: calling self._execute() 26264 1727204264.10342: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204264.10356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204264.10378: variable 'omit' from source: magic vars 26264 1727204264.11826: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.11859: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204264.12204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204264.16374: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204264.16458: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204264.16506: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204264.16546: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204264.16587: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204264.16678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.16731: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.16768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.16817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.16836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.17702: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.17724: Evaluated conditional (ansible_distribution_major_version | int < 8): False 26264 1727204264.17732: when evaluation is False, skipping this task 26264 1727204264.17738: _execute() done 26264 1727204264.17744: dumping result to json 26264 1727204264.17755: done dumping result, returning 26264 1727204264.17769: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-000000000042] 26264 1727204264.17780: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000042 26264 1727204264.17897: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000042 26264 1727204264.17905: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 26264 1727204264.17963: no more pending results, returning 
what we have 26264 1727204264.17969: results queue empty 26264 1727204264.17970: checking for any_errors_fatal 26264 1727204264.17977: done checking for any_errors_fatal 26264 1727204264.17978: checking for max_fail_percentage 26264 1727204264.17980: done checking for max_fail_percentage 26264 1727204264.17981: checking to see if all hosts have failed and the running result is not ok 26264 1727204264.17982: done checking to see if all hosts have failed 26264 1727204264.17983: getting the remaining hosts for this loop 26264 1727204264.17984: done getting the remaining hosts for this loop 26264 1727204264.17989: getting the next task for host managed-node3 26264 1727204264.17996: done getting next task for host managed-node3 26264 1727204264.18001: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 26264 1727204264.18003: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204264.18020: getting variables 26264 1727204264.18022: in VariableManager get_vars() 26264 1727204264.18066: Calling all_inventory to load vars for managed-node3 26264 1727204264.18070: Calling groups_inventory to load vars for managed-node3 26264 1727204264.18072: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204264.18083: Calling all_plugins_play to load vars for managed-node3 26264 1727204264.18086: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204264.18089: Calling groups_plugins_play to load vars for managed-node3 26264 1727204264.20088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204264.22010: done with get_vars() 26264 1727204264.22034: done getting variables 26264 1727204264.22098: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.127) 0:00:28.070 ***** 26264 1727204264.22129: entering _queue_task() for managed-node3/fail 26264 1727204264.22468: worker is 1 (out of 1 available) 26264 1727204264.22482: exiting _queue_task() for managed-node3/fail 26264 1727204264.22498: done queuing things up, now waiting for results queue to drain 26264 1727204264.22500: waiting for pending results... 
26264 1727204264.23229: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 26264 1727204264.23451: in run() - task 0affcd87-79f5-5ff5-08b0-000000000043 26264 1727204264.23584: variable 'ansible_search_path' from source: unknown 26264 1727204264.23588: variable 'ansible_search_path' from source: unknown 26264 1727204264.23621: calling self._execute() 26264 1727204264.23823: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204264.23827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204264.23838: variable 'omit' from source: magic vars 26264 1727204264.24685: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.24699: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204264.24936: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204264.25285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204264.27779: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204264.27854: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204264.27894: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204264.27933: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204264.27959: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204264.28045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 26264 1727204264.28097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.28124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.28169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.28184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.28242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.28270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.28291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.28334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.28358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.28400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.28428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.28456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.28497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.28517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.29417: variable 'network_connections' from source: play vars 26264 1727204264.29435: variable 'profile' from source: play vars 26264 1727204264.29632: variable 'profile' from source: play vars 26264 1727204264.29642: variable 'interface' from source: set_fact 26264 1727204264.29715: variable 'interface' from source: set_fact 26264 1727204264.29957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204264.30169: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204264.30215: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204264.30254: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204264.30295: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204264.30347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204264.30379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204264.30414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.30452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204264.30514: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204264.31426: variable 'network_connections' from source: play vars 26264 1727204264.31454: variable 'profile' from source: play vars 26264 1727204264.31567: variable 'profile' from source: play vars 26264 1727204264.31617: variable 'interface' from source: set_fact 26264 1727204264.31685: variable 'interface' from source: set_fact 26264 1727204264.31797: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26264 1727204264.31833: when evaluation is False, skipping this task 26264 1727204264.31840: _execute() done 26264 1727204264.31944: dumping result to json 26264 1727204264.31953: done dumping result, returning 26264 1727204264.31968: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-000000000043] 26264 1727204264.31990: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000043 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26264 1727204264.32151: no more pending results, returning what we have 26264 1727204264.32156: results queue empty 26264 1727204264.32157: checking for any_errors_fatal 26264 1727204264.32164: done checking for any_errors_fatal 26264 1727204264.32165: checking for max_fail_percentage 26264 1727204264.32167: done checking for max_fail_percentage 26264 1727204264.32167: checking to see if all hosts have failed and the running result is not ok 26264 1727204264.32168: done checking to see if all hosts have failed 26264 1727204264.32169: getting the remaining hosts for this loop 26264 1727204264.32171: done getting the remaining hosts for this loop 26264 1727204264.32177: getting the next task for host managed-node3 26264 1727204264.32183: done getting next task for host managed-node3 26264 1727204264.32188: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 26264 1727204264.32190: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204264.32208: getting variables 26264 1727204264.32210: in VariableManager get_vars() 26264 1727204264.32252: Calling all_inventory to load vars for managed-node3 26264 1727204264.32256: Calling groups_inventory to load vars for managed-node3 26264 1727204264.32258: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204264.32272: Calling all_plugins_play to load vars for managed-node3 26264 1727204264.32274: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204264.32278: Calling groups_plugins_play to load vars for managed-node3 26264 1727204264.32900: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000043 26264 1727204264.32903: WORKER PROCESS EXITING 26264 1727204264.34886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204264.39057: done with get_vars() 26264 1727204264.39099: done getting variables 26264 1727204264.39265: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.171) 0:00:28.242 ***** 26264 1727204264.39301: entering _queue_task() for managed-node3/package 26264 1727204264.41107: worker is 1 (out of 1 available) 26264 1727204264.41182: exiting _queue_task() for managed-node3/package 26264 1727204264.41195: done queuing things up, now waiting for results queue to drain 26264 1727204264.41197: waiting for pending results... 
26264 1727204264.41674: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 26264 1727204264.41871: in run() - task 0affcd87-79f5-5ff5-08b0-000000000044 26264 1727204264.41885: variable 'ansible_search_path' from source: unknown 26264 1727204264.41889: variable 'ansible_search_path' from source: unknown 26264 1727204264.42041: calling self._execute() 26264 1727204264.42340: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204264.42345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204264.42360: variable 'omit' from source: magic vars 26264 1727204264.44045: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.44062: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204264.44885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204264.45544: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204264.45588: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204264.45623: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204264.45837: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204264.46150: variable 'network_packages' from source: role '' defaults 26264 1727204264.46386: variable '__network_provider_setup' from source: role '' defaults 26264 1727204264.46399: variable '__network_service_name_default_nm' from source: role '' defaults 26264 1727204264.46591: variable '__network_service_name_default_nm' from source: role '' defaults 26264 1727204264.46600: variable '__network_packages_default_nm' from source: role '' defaults 26264 1727204264.46779: variable 
'__network_packages_default_nm' from source: role '' defaults 26264 1727204264.47070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204264.54622: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204264.55241: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204264.55287: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204264.55444: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204264.55475: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204264.55601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.55633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.55669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.55708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.55723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 
1727204264.55885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.55907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.55929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.56075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.56097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.56675: variable '__network_packages_default_gobject_packages' from source: role '' defaults 26264 1727204264.56948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.56983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.57010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.57045: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.57184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.57400: variable 'ansible_python' from source: facts 26264 1727204264.57429: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 26264 1727204264.57632: variable '__network_wpa_supplicant_required' from source: role '' defaults 26264 1727204264.57960: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26264 1727204264.58199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.58222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.58245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.58405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.58419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.58470: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.58612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.58638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.58682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.58696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.59085: variable 'network_connections' from source: play vars 26264 1727204264.59091: variable 'profile' from source: play vars 26264 1727204264.59310: variable 'profile' from source: play vars 26264 1727204264.59318: variable 'interface' from source: set_fact 26264 1727204264.59504: variable 'interface' from source: set_fact 26264 1727204264.60307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204264.60336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204264.60367: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.61115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204264.61172: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204264.62153: variable 'network_connections' from source: play vars 26264 1727204264.62162: variable 'profile' from source: play vars 26264 1727204264.62389: variable 'profile' from source: play vars 26264 1727204264.62396: variable 'interface' from source: set_fact 26264 1727204264.62587: variable 'interface' from source: set_fact 26264 1727204264.62625: variable '__network_packages_default_wireless' from source: role '' defaults 26264 1727204264.62824: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204264.63721: variable 'network_connections' from source: play vars 26264 1727204264.63725: variable 'profile' from source: play vars 26264 1727204264.63919: variable 'profile' from source: play vars 26264 1727204264.63923: variable 'interface' from source: set_fact 26264 1727204264.64151: variable 'interface' from source: set_fact 26264 1727204264.64185: variable '__network_packages_default_team' from source: role '' defaults 26264 1727204264.64392: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204264.65257: variable 'network_connections' from source: play vars 26264 1727204264.65263: variable 'profile' from source: play vars 26264 1727204264.65410: variable 'profile' from source: play vars 26264 1727204264.65414: variable 'interface' from source: set_fact 26264 1727204264.65526: variable 'interface' from source: set_fact 26264 1727204264.65727: variable '__network_service_name_default_initscripts' from source: role '' defaults 26264 1727204264.65981: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 26264 1727204264.65988: variable '__network_packages_default_initscripts' from source: role '' defaults 26264 1727204264.66049: variable '__network_packages_default_initscripts' from source: role '' defaults 26264 1727204264.66757: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 26264 1727204264.68701: variable 'network_connections' from source: play vars 26264 1727204264.68705: variable 'profile' from source: play vars 26264 1727204264.68773: variable 'profile' from source: play vars 26264 1727204264.68781: variable 'interface' from source: set_fact 26264 1727204264.68842: variable 'interface' from source: set_fact 26264 1727204264.68850: variable 'ansible_distribution' from source: facts 26264 1727204264.68857: variable '__network_rh_distros' from source: role '' defaults 26264 1727204264.68862: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.68880: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 26264 1727204264.69067: variable 'ansible_distribution' from source: facts 26264 1727204264.69071: variable '__network_rh_distros' from source: role '' defaults 26264 1727204264.69078: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.69093: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 26264 1727204264.69314: variable 'ansible_distribution' from source: facts 26264 1727204264.69319: variable '__network_rh_distros' from source: role '' defaults 26264 1727204264.69323: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.69325: variable 'network_provider' from source: set_fact 26264 1727204264.69347: variable 'ansible_facts' from source: unknown 26264 1727204264.70427: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 26264 
1727204264.70432: when evaluation is False, skipping this task 26264 1727204264.70434: _execute() done 26264 1727204264.70437: dumping result to json 26264 1727204264.70439: done dumping result, returning 26264 1727204264.70448: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-5ff5-08b0-000000000044] 26264 1727204264.70457: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000044 skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 26264 1727204264.70610: no more pending results, returning what we have 26264 1727204264.70615: results queue empty 26264 1727204264.70616: checking for any_errors_fatal 26264 1727204264.70625: done checking for any_errors_fatal 26264 1727204264.70626: checking for max_fail_percentage 26264 1727204264.70628: done checking for max_fail_percentage 26264 1727204264.70629: checking to see if all hosts have failed and the running result is not ok 26264 1727204264.70630: done checking to see if all hosts have failed 26264 1727204264.70631: getting the remaining hosts for this loop 26264 1727204264.70632: done getting the remaining hosts for this loop 26264 1727204264.70637: getting the next task for host managed-node3 26264 1727204264.70645: done getting next task for host managed-node3 26264 1727204264.70653: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 26264 1727204264.70655: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204264.70667: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000044 26264 1727204264.70677: WORKER PROCESS EXITING 26264 1727204264.70683: getting variables 26264 1727204264.70685: in VariableManager get_vars() 26264 1727204264.70729: Calling all_inventory to load vars for managed-node3 26264 1727204264.70733: Calling groups_inventory to load vars for managed-node3 26264 1727204264.70736: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204264.70756: Calling all_plugins_play to load vars for managed-node3 26264 1727204264.70760: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204264.70763: Calling groups_plugins_play to load vars for managed-node3 26264 1727204264.72818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204264.75252: done with get_vars() 26264 1727204264.75288: done getting variables 26264 1727204264.75346: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.360) 0:00:28.603 ***** 26264 1727204264.75385: entering _queue_task() for managed-node3/package 26264 1727204264.75724: worker is 1 (out of 1 available) 26264 1727204264.75738: exiting _queue_task() for managed-node3/package 26264 1727204264.75756: done queuing things up, now waiting for results queue to drain 26264 1727204264.75757: waiting for pending results... 
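The "Install packages" skip above hinges on Ansible's `subset` Jinja test (loaded from the `mathstuff` test plugins the log shows): the task runs only when at least one required package is missing from `ansible_facts.packages`. A minimal Python sketch of that check, with hypothetical package values (the real evaluation happens through Jinja2 templating, not this function):

```python
def needs_install(network_packages, installed_packages):
    """Mimic the 'when' expression from the log:
        not network_packages is subset(ansible_facts.packages.keys())
    Returns True when at least one required package is not installed."""
    return not set(network_packages) <= set(installed_packages)

# Hypothetical values: every required package is already present,
# so the conditional evaluates False and the task is skipped.
required = ["NetworkManager"]
installed = {"NetworkManager": [{"version": "1.0"}]}  # keyed by package name
print(needs_install(required, installed.keys()))  # False -> task skipped
```

This matches the log's `false_condition` output: the whole `when` expression was `False`, so the executor dumps a `skipping:` result instead of dispatching the `package` action.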
26264 1727204264.76214: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 26264 1727204264.76440: in run() - task 0affcd87-79f5-5ff5-08b0-000000000045 26264 1727204264.76456: variable 'ansible_search_path' from source: unknown 26264 1727204264.76460: variable 'ansible_search_path' from source: unknown 26264 1727204264.76614: calling self._execute() 26264 1727204264.76827: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204264.76834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204264.76844: variable 'omit' from source: magic vars 26264 1727204264.77613: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.77627: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204264.77878: variable 'network_state' from source: role '' defaults 26264 1727204264.77891: Evaluated conditional (network_state != {}): False 26264 1727204264.77894: when evaluation is False, skipping this task 26264 1727204264.77897: _execute() done 26264 1727204264.77900: dumping result to json 26264 1727204264.77903: done dumping result, returning 26264 1727204264.77919: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-5ff5-08b0-000000000045] 26264 1727204264.77924: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000045 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204264.78101: no more pending results, returning what we have 26264 1727204264.78106: results queue empty 26264 1727204264.78107: checking for any_errors_fatal 26264 1727204264.78116: done checking for any_errors_fatal 26264 1727204264.78117: checking for max_fail_percentage 26264 
1727204264.78119: done checking for max_fail_percentage 26264 1727204264.78120: checking to see if all hosts have failed and the running result is not ok 26264 1727204264.78121: done checking to see if all hosts have failed 26264 1727204264.78122: getting the remaining hosts for this loop 26264 1727204264.78123: done getting the remaining hosts for this loop 26264 1727204264.78128: getting the next task for host managed-node3 26264 1727204264.78135: done getting next task for host managed-node3 26264 1727204264.78139: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 26264 1727204264.78142: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204264.78166: getting variables 26264 1727204264.78168: in VariableManager get_vars() 26264 1727204264.78212: Calling all_inventory to load vars for managed-node3 26264 1727204264.78216: Calling groups_inventory to load vars for managed-node3 26264 1727204264.78218: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204264.78235: Calling all_plugins_play to load vars for managed-node3 26264 1727204264.78238: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204264.78242: Calling groups_plugins_play to load vars for managed-node3 26264 1727204264.78763: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000045 26264 1727204264.78767: WORKER PROCESS EXITING 26264 1727204264.80033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204264.82906: done with get_vars() 26264 1727204264.82945: done getting variables 26264 1727204264.83016: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.076) 0:00:28.679 ***** 26264 1727204264.83056: entering _queue_task() for managed-node3/package 26264 1727204264.83414: worker is 1 (out of 1 available) 26264 1727204264.83427: exiting _queue_task() for managed-node3/package 26264 1727204264.83440: done queuing things up, now waiting for results queue to drain 26264 1727204264.83442: waiting for pending results... 26264 1727204264.83744: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 26264 1727204264.83861: in run() - task 0affcd87-79f5-5ff5-08b0-000000000046 26264 1727204264.83876: variable 'ansible_search_path' from source: unknown 26264 1727204264.83880: variable 'ansible_search_path' from source: unknown 26264 1727204264.83922: calling self._execute() 26264 1727204264.84022: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204264.84027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204264.84038: variable 'omit' from source: magic vars 26264 1727204264.84447: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.84463: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204264.84593: variable 'network_state' from source: role '' defaults 26264 1727204264.84602: Evaluated conditional (network_state != {}): False 26264 1727204264.84605: when evaluation is False, 
skipping this task 26264 1727204264.84609: _execute() done 26264 1727204264.84611: dumping result to json 26264 1727204264.84613: done dumping result, returning 26264 1727204264.84621: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-5ff5-08b0-000000000046] 26264 1727204264.84628: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000046 26264 1727204264.84733: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000046 26264 1727204264.84736: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204264.84805: no more pending results, returning what we have 26264 1727204264.84809: results queue empty 26264 1727204264.84809: checking for any_errors_fatal 26264 1727204264.84818: done checking for any_errors_fatal 26264 1727204264.84819: checking for max_fail_percentage 26264 1727204264.84820: done checking for max_fail_percentage 26264 1727204264.84821: checking to see if all hosts have failed and the running result is not ok 26264 1727204264.84822: done checking to see if all hosts have failed 26264 1727204264.84823: getting the remaining hosts for this loop 26264 1727204264.84825: done getting the remaining hosts for this loop 26264 1727204264.84829: getting the next task for host managed-node3 26264 1727204264.84836: done getting next task for host managed-node3 26264 1727204264.84839: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 26264 1727204264.84841: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204264.84858: getting variables 26264 1727204264.84860: in VariableManager get_vars() 26264 1727204264.84902: Calling all_inventory to load vars for managed-node3 26264 1727204264.84905: Calling groups_inventory to load vars for managed-node3 26264 1727204264.84907: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204264.84920: Calling all_plugins_play to load vars for managed-node3 26264 1727204264.84923: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204264.84926: Calling groups_plugins_play to load vars for managed-node3 26264 1727204264.86763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204264.88437: done with get_vars() 26264 1727204264.88471: done getting variables 26264 1727204264.88538: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.055) 0:00:28.734 ***** 26264 1727204264.88579: entering _queue_task() for managed-node3/service 26264 1727204264.89540: worker is 1 (out of 1 available) 26264 1727204264.89557: exiting _queue_task() for managed-node3/service 26264 1727204264.89572: done queuing things up, now waiting for results queue to drain 26264 1727204264.89574: waiting for pending results... 
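Both skipped tasks above show the same pattern: an ordered list of `when` conditions is evaluated one at a time, and the first one that comes out `False` ends evaluation and skips the task (here, `ansible_distribution_major_version != '6'` is `True`, then `network_state != {}` is `False`). An illustrative sketch of that short-circuit behavior, using Python's `eval` as a stand-in for Jinja2 templating (real Ansible templates each expression before testing truthiness):

```python
def evaluate_when(conditions, variables):
    """Evaluate an ordered list of 'when' expressions against a
    variable mapping. Returns (result, failing_expression), stopping
    at the first condition that evaluates False -- mirroring the
    'when evaluation is False, skipping this task' records in the log.
    Sketch only: eval() stands in for Jinja2 expression evaluation."""
    for expr in conditions:
        if not eval(expr, {}, variables):
            return False, expr
    return True, None

# Hypothetical variable values matching the skipped task in the log:
variables = {"ansible_distribution_major_version": "9", "network_state": {}}
ok, false_condition = evaluate_when(
    ["ansible_distribution_major_version != '6'", "network_state != {}"],
    variables,
)
print(ok, false_condition)  # False network_state != {}
```

The `false_condition` field in the `skipping:` JSON result is exactly this failing expression, which is why the log can report it verbatim.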
26264 1727204264.90416: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 26264 1727204264.90701: in run() - task 0affcd87-79f5-5ff5-08b0-000000000047 26264 1727204264.90784: variable 'ansible_search_path' from source: unknown 26264 1727204264.90793: variable 'ansible_search_path' from source: unknown 26264 1727204264.90834: calling self._execute() 26264 1727204264.90975: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204264.90989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204264.91002: variable 'omit' from source: magic vars 26264 1727204264.91392: variable 'ansible_distribution_major_version' from source: facts 26264 1727204264.91408: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204264.91545: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204264.91750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204264.94590: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204264.94653: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204264.94700: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204264.94734: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204264.94761: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204264.94843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 26264 1727204264.94893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.94926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.94971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.94986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.95037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.95062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.95088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.95133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.95148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.95192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204264.95214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204264.95247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.95290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204264.95304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204264.95492: variable 'network_connections' from source: play vars 26264 1727204264.95506: variable 'profile' from source: play vars 26264 1727204264.95591: variable 'profile' from source: play vars 26264 1727204264.95595: variable 'interface' from source: set_fact 26264 1727204264.95661: variable 'interface' from source: set_fact 26264 1727204264.95738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204264.95918: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204264.95951: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204264.95985: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204264.96020: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204264.96066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204264.96088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204264.96118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204264.96145: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204264.96197: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204264.96464: variable 'network_connections' from source: play vars 26264 1727204264.96471: variable 'profile' from source: play vars 26264 1727204264.96527: variable 'profile' from source: play vars 26264 1727204264.96530: variable 'interface' from source: set_fact 26264 1727204264.96600: variable 'interface' from source: set_fact 26264 1727204264.96625: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26264 1727204264.96629: when evaluation is False, skipping this task 26264 1727204264.96631: _execute() done 26264 1727204264.96634: dumping result to json 26264 1727204264.96636: done dumping result, returning 26264 1727204264.96650: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-000000000047] 26264 1727204264.96660: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000047 26264 1727204264.96752: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000047 26264 1727204264.96756: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26264 1727204264.96809: no more pending results, returning what we have 26264 1727204264.96813: results queue empty 26264 1727204264.96814: checking for any_errors_fatal 26264 1727204264.96823: done checking for any_errors_fatal 26264 1727204264.96824: checking for max_fail_percentage 26264 1727204264.96827: done checking for max_fail_percentage 26264 1727204264.96828: checking to see if all hosts have failed and the running result is not ok 26264 1727204264.96829: done checking to see if all hosts have failed 26264 1727204264.96830: getting the remaining hosts for this loop 26264 1727204264.96832: done getting the remaining hosts for this loop 26264 1727204264.96837: getting the next task for host managed-node3 26264 1727204264.96843: done getting next task for host managed-node3 26264 1727204264.96850: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 26264 1727204264.96852: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204264.96867: getting variables 26264 1727204264.96869: in VariableManager get_vars() 26264 1727204264.96907: Calling all_inventory to load vars for managed-node3 26264 1727204264.96910: Calling groups_inventory to load vars for managed-node3 26264 1727204264.96913: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204264.96924: Calling all_plugins_play to load vars for managed-node3 26264 1727204264.96926: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204264.96929: Calling groups_plugins_play to load vars for managed-node3 26264 1727204264.98651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204265.00439: done with get_vars() 26264 1727204265.00473: done getting variables 26264 1727204265.00538: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.119) 0:00:28.854 ***** 26264 1727204265.00575: entering _queue_task() for managed-node3/service 26264 1727204265.00932: worker is 1 (out of 1 available) 26264 1727204265.00947: exiting _queue_task() for managed-node3/service 26264 1727204265.00965: done queuing things up, now waiting for results queue to drain 26264 1727204265.00967: waiting for pending results... 
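The `TASK [...]` header lines interleaved in this log (e.g. `Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.119) 0:00:28.854 *****`, emitted by a timing callback such as `ansible.posix.profile_tasks`) carry two durations: the parenthesized time spent in the previous task and the cumulative playbook time. A sketch that extracts both as seconds, useful when profiling a run from a capture like this one:

```python
import re

# Matches "(H:MM:SS.mmm) H:MM:SS.mmm" from the task header lines.
TIMING = re.compile(r"\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)")

def parse_timing(line):
    """Return (previous_task_seconds, cumulative_seconds) from a task
    header line, or None if the line carries no timing information."""
    m = TIMING.search(line)
    if not m:
        return None
    h1, m1, s1, h2, m2, s2 = m.groups()
    task_s = int(h1) * 3600 + int(m1) * 60 + float(s1)
    total_s = int(h2) * 3600 + int(m2) * 60 + float(s2)
    return task_s, total_s

line = "Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.119) 0:00:28.854 *****"
print(parse_timing(line))  # (0.119, 28.854)
```

Running this over every header line yields per-task durations, confirming what the log already suggests: skipped tasks like the ones above cost well under a second each.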
26264 1727204265.01262: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 26264 1727204265.01368: in run() - task 0affcd87-79f5-5ff5-08b0-000000000048 26264 1727204265.01382: variable 'ansible_search_path' from source: unknown 26264 1727204265.01386: variable 'ansible_search_path' from source: unknown 26264 1727204265.01428: calling self._execute() 26264 1727204265.01534: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204265.01538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204265.01548: variable 'omit' from source: magic vars 26264 1727204265.01968: variable 'ansible_distribution_major_version' from source: facts 26264 1727204265.01981: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204265.02149: variable 'network_provider' from source: set_fact 26264 1727204265.02156: variable 'network_state' from source: role '' defaults 26264 1727204265.02171: Evaluated conditional (network_provider == "nm" or network_state != {}): True 26264 1727204265.02179: variable 'omit' from source: magic vars 26264 1727204265.02221: variable 'omit' from source: magic vars 26264 1727204265.02249: variable 'network_service_name' from source: role '' defaults 26264 1727204265.02329: variable 'network_service_name' from source: role '' defaults 26264 1727204265.02442: variable '__network_provider_setup' from source: role '' defaults 26264 1727204265.02446: variable '__network_service_name_default_nm' from source: role '' defaults 26264 1727204265.02518: variable '__network_service_name_default_nm' from source: role '' defaults 26264 1727204265.02526: variable '__network_packages_default_nm' from source: role '' defaults 26264 1727204265.02591: variable '__network_packages_default_nm' from source: role '' defaults 26264 1727204265.02830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 26264 1727204265.11400: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204265.11468: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204265.11505: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204265.11539: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204265.11568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204265.11634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204265.11668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204265.11693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204265.11732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204265.11748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204265.11794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 26264 1727204265.11815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204265.11839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204265.11879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204265.11892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204265.12112: variable '__network_packages_default_gobject_packages' from source: role '' defaults 26264 1727204265.12244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204265.12275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204265.12301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204265.12340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204265.12357: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204265.12458: variable 'ansible_python' from source: facts 26264 1727204265.12481: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 26264 1727204265.12574: variable '__network_wpa_supplicant_required' from source: role '' defaults 26264 1727204265.12658: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26264 1727204265.12792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204265.12820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204265.12849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204265.12888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204265.12903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204265.12957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204265.12982: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204265.13007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204265.13051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204265.13070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204265.13216: variable 'network_connections' from source: play vars 26264 1727204265.13222: variable 'profile' from source: play vars 26264 1727204265.13304: variable 'profile' from source: play vars 26264 1727204265.13309: variable 'interface' from source: set_fact 26264 1727204265.13376: variable 'interface' from source: set_fact 26264 1727204265.13483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204265.13685: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204265.13734: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204265.13778: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204265.13821: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204265.13886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204265.13920: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204265.13951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204265.13989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204265.14030: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204265.14321: variable 'network_connections' from source: play vars 26264 1727204265.14329: variable 'profile' from source: play vars 26264 1727204265.14415: variable 'profile' from source: play vars 26264 1727204265.14420: variable 'interface' from source: set_fact 26264 1727204265.14489: variable 'interface' from source: set_fact 26264 1727204265.14523: variable '__network_packages_default_wireless' from source: role '' defaults 26264 1727204265.14609: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204265.14935: variable 'network_connections' from source: play vars 26264 1727204265.14940: variable 'profile' from source: play vars 26264 1727204265.15028: variable 'profile' from source: play vars 26264 1727204265.15031: variable 'interface' from source: set_fact 26264 1727204265.15109: variable 'interface' from source: set_fact 26264 1727204265.15133: variable '__network_packages_default_team' from source: role '' defaults 26264 1727204265.15214: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204265.15513: variable 
'network_connections' from source: play vars 26264 1727204265.15516: variable 'profile' from source: play vars 26264 1727204265.15594: variable 'profile' from source: play vars 26264 1727204265.15598: variable 'interface' from source: set_fact 26264 1727204265.15678: variable 'interface' from source: set_fact 26264 1727204265.15727: variable '__network_service_name_default_initscripts' from source: role '' defaults 26264 1727204265.15788: variable '__network_service_name_default_initscripts' from source: role '' defaults 26264 1727204265.15794: variable '__network_packages_default_initscripts' from source: role '' defaults 26264 1727204265.15851: variable '__network_packages_default_initscripts' from source: role '' defaults 26264 1727204265.16268: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 26264 1727204265.16811: variable 'network_connections' from source: play vars 26264 1727204265.16815: variable 'profile' from source: play vars 26264 1727204265.16915: variable 'profile' from source: play vars 26264 1727204265.16918: variable 'interface' from source: set_fact 26264 1727204265.16995: variable 'interface' from source: set_fact 26264 1727204265.17001: variable 'ansible_distribution' from source: facts 26264 1727204265.17004: variable '__network_rh_distros' from source: role '' defaults 26264 1727204265.17014: variable 'ansible_distribution_major_version' from source: facts 26264 1727204265.17029: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 26264 1727204265.17242: variable 'ansible_distribution' from source: facts 26264 1727204265.17245: variable '__network_rh_distros' from source: role '' defaults 26264 1727204265.17247: variable 'ansible_distribution_major_version' from source: facts 26264 1727204265.17250: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 26264 1727204265.17472: variable 'ansible_distribution' from source: 
facts 26264 1727204265.17475: variable '__network_rh_distros' from source: role '' defaults 26264 1727204265.17477: variable 'ansible_distribution_major_version' from source: facts 26264 1727204265.17479: variable 'network_provider' from source: set_fact 26264 1727204265.17581: variable 'omit' from source: magic vars 26264 1727204265.17584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204265.17587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204265.17589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204265.17591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204265.17594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204265.17596: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204265.17598: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204265.17599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204265.17701: Set connection var ansible_pipelining to False 26264 1727204265.17704: Set connection var ansible_connection to ssh 26264 1727204265.17707: Set connection var ansible_shell_type to sh 26264 1727204265.17712: Set connection var ansible_shell_executable to /bin/sh 26264 1727204265.17721: Set connection var ansible_timeout to 10 26264 1727204265.17728: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204265.17758: variable 'ansible_shell_executable' from source: unknown 26264 1727204265.17761: variable 'ansible_connection' from source: unknown 26264 1727204265.17766: variable 'ansible_module_compression' from source: unknown 26264 1727204265.17769: 
variable 'ansible_shell_type' from source: unknown 26264 1727204265.17772: variable 'ansible_shell_executable' from source: unknown 26264 1727204265.17780: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204265.17787: variable 'ansible_pipelining' from source: unknown 26264 1727204265.17789: variable 'ansible_timeout' from source: unknown 26264 1727204265.17793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204265.17889: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204265.17898: variable 'omit' from source: magic vars 26264 1727204265.17904: starting attempt loop 26264 1727204265.17907: running the handler 26264 1727204265.17985: variable 'ansible_facts' from source: unknown 26264 1727204265.18808: _low_level_execute_command(): starting 26264 1727204265.18869: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204265.19779: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.19788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.19830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204265.19835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204265.19848: stderr chunk (state=3): >>>debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.19858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.19861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204265.19954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204265.19960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204265.19998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204265.20042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204265.21694: stdout chunk (state=3): >>>/root <<< 26264 1727204265.21771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204265.21851: stderr chunk (state=3): >>><<< 26264 1727204265.21855: stdout chunk (state=3): >>><<< 26264 1727204265.21884: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204265.21898: _low_level_execute_command(): starting 26264 1727204265.21905: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617 `" && echo ansible-tmp-1727204265.2188578-28689-218800483174617="` echo /root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617 `" ) && sleep 0' 26264 1727204265.22992: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.22995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.23037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204265.23042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204265.23058: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.23068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204265.23157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204265.23163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204265.23256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204265.23328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204265.25138: stdout chunk (state=3): >>>ansible-tmp-1727204265.2188578-28689-218800483174617=/root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617 <<< 26264 1727204265.25310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204265.25340: stderr chunk (state=3): >>><<< 26264 1727204265.25344: stdout chunk (state=3): >>><<< 26264 1727204265.25371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204265.2188578-28689-218800483174617=/root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204265.25673: variable 'ansible_module_compression' from source: unknown 26264 1727204265.25676: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 26264 1727204265.25682: variable 'ansible_facts' from source: unknown 26264 1727204265.25709: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617/AnsiballZ_systemd.py 26264 1727204265.25878: Sending initial data 26264 1727204265.25882: Sent initial data (156 bytes) 26264 1727204265.26852: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204265.26872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.26890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204265.26909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.26951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204265.26967: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204265.26986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204265.27004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204265.27016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204265.27026: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204265.27037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.27052: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204265.27073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.27088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204265.27105: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204265.27118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204265.27202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204265.27225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204265.27238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204265.27382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204265.29040: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204265.29081: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204265.29124: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmplbdyoti2 /root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617/AnsiballZ_systemd.py <<< 26264 1727204265.29179: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204265.31471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204265.31716: stderr chunk (state=3): >>><<< 26264 1727204265.31720: stdout chunk (state=3): >>><<< 26264 1727204265.31722: done transferring module to remote 26264 1727204265.31724: _low_level_execute_command(): starting 26264 1727204265.31727: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617/ /root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617/AnsiballZ_systemd.py && sleep 0' 26264 1727204265.32348: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204265.32368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.32396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204265.32417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.32462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204265.32479: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204265.32497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204265.32520: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204265.32534: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204265.32554: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204265.32573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.32588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 26264 1727204265.32607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.32626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204265.32638: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204265.32652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204265.32735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204265.32758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204265.32770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204265.32837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204265.34517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204265.34597: stderr chunk (state=3): >>><<< 26264 1727204265.34600: stdout chunk (state=3): >>><<< 26264 1727204265.34621: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204265.34624: _low_level_execute_command(): starting 26264 1727204265.34629: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617/AnsiballZ_systemd.py && sleep 0' 26264 1727204265.36096: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204265.36158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.36270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204265.36291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.36338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204265.36356: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204265.36375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204265.36394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204265.36407: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204265.36418: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204265.36431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.36445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204265.36468: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.36484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204265.36497: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204265.36512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204265.36682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204265.36710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204265.36728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204265.36816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204265.61731: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 26264 1727204265.61774: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "16183296", "MemoryAvailable": "infinity", "CPUUsageNSec": "1527541000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": 
"infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": 
"no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", 
"UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 26264 1727204265.63362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204265.63372: stdout chunk (state=3): >>><<< 26264 1727204265.63375: stderr chunk (state=3): >>><<< 26264 1727204265.63573: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "16183296", "MemoryAvailable": "infinity", "CPUUsageNSec": "1527541000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204265.63595: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204265.63619: _low_level_execute_command(): starting 26264 1727204265.63631: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204265.2188578-28689-218800483174617/ > /dev/null 2>&1 && sleep 0' 26264 1727204265.65144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204265.65168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.65184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204265.65210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.65256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204265.65320: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204265.65383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204265.65404: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204265.65423: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204265.65436: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204265.65451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204265.65468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204265.65485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204265.65497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204265.65509: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204265.65523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204265.65716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204265.65741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204265.65770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204265.65844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204265.67716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204265.67720: stdout chunk (state=3): >>><<< 26264 1727204265.67722: stderr chunk (state=3): >>><<< 26264 1727204265.67973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204265.67977: handler run complete 26264 1727204265.67979: attempt loop complete, returning result 26264 1727204265.67981: _execute() done 26264 1727204265.67983: dumping result to json 26264 1727204265.67985: done dumping result, returning 26264 1727204265.67987: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-5ff5-08b0-000000000048] 26264 1727204265.67989: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000048 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204265.68187: no more pending results, returning what we have 26264 1727204265.68472: results queue empty 26264 1727204265.68474: checking for any_errors_fatal 26264 1727204265.68483: done checking for any_errors_fatal 26264 1727204265.68484: checking for max_fail_percentage 26264 1727204265.68486: done checking for max_fail_percentage 26264 1727204265.68487: checking to see if all hosts have failed and the running result is not ok 26264 1727204265.68488: done checking to see if all hosts have failed 26264 
1727204265.68488: getting the remaining hosts for this loop 26264 1727204265.68490: done getting the remaining hosts for this loop 26264 1727204265.68494: getting the next task for host managed-node3 26264 1727204265.68500: done getting next task for host managed-node3 26264 1727204265.68505: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 26264 1727204265.68507: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204265.68518: getting variables 26264 1727204265.68520: in VariableManager get_vars() 26264 1727204265.68563: Calling all_inventory to load vars for managed-node3 26264 1727204265.68568: Calling groups_inventory to load vars for managed-node3 26264 1727204265.68571: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204265.68583: Calling all_plugins_play to load vars for managed-node3 26264 1727204265.68585: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204265.68588: Calling groups_plugins_play to load vars for managed-node3 26264 1727204265.78550: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000048 26264 1727204265.78555: WORKER PROCESS EXITING 26264 1727204265.80819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204265.84906: done with get_vars() 26264 1727204265.84942: done getting variables 26264 1727204265.85121: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.845) 0:00:29.700 ***** 26264 1727204265.85157: entering _queue_task() for managed-node3/service 26264 1727204265.85584: worker is 1 (out of 1 available) 26264 1727204265.85596: exiting _queue_task() for managed-node3/service 26264 1727204265.85610: done queuing things up, now waiting for results queue to drain 26264 1727204265.85612: waiting for pending results... 26264 1727204265.85990: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 26264 1727204265.86128: in run() - task 0affcd87-79f5-5ff5-08b0-000000000049 26264 1727204265.86153: variable 'ansible_search_path' from source: unknown 26264 1727204265.86183: variable 'ansible_search_path' from source: unknown 26264 1727204265.86269: calling self._execute() 26264 1727204265.86556: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204265.86571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204265.86597: variable 'omit' from source: magic vars 26264 1727204265.87033: variable 'ansible_distribution_major_version' from source: facts 26264 1727204265.87057: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204265.87197: variable 'network_provider' from source: set_fact 26264 1727204265.87209: Evaluated conditional (network_provider == "nm"): True 26264 1727204265.87431: variable '__network_wpa_supplicant_required' from source: role '' defaults 26264 1727204265.87605: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26264 1727204265.88014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204265.91296: Loading FilterModule 'core' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204265.92040: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204265.92113: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204265.92278: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204265.92315: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204265.92518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204265.92598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204265.92632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204265.92689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204265.92708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204265.92759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204265.92794: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204265.92822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204265.92871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204265.92898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204265.92942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204265.92978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204265.93015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204265.93061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204265.93082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
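The large property dump earlier in this transcript is the return value of the `ansible.legacy.systemd` module for the "Enable and start NetworkManager" task. A minimal sketch of a task producing such an invocation, reconstructed from the logged `module_args` (`name: NetworkManager, state: started, enabled: true`) — note the role actually dispatches through its `service` action plugin, so this standalone form is an approximation:

```yaml
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
```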
26264 1727204265.93261: variable 'network_connections' from source: play vars 26264 1727204265.93282: variable 'profile' from source: play vars 26264 1727204265.93376: variable 'profile' from source: play vars 26264 1727204265.93386: variable 'interface' from source: set_fact 26264 1727204265.93459: variable 'interface' from source: set_fact 26264 1727204265.93544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204265.93906: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204265.93950: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204265.93991: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204265.94024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204265.94081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204265.94108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204265.94138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204265.94176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204265.94225: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204265.94455: variable 'network_connections' 
from source: play vars 26264 1727204265.94469: variable 'profile' from source: play vars 26264 1727204265.94540: variable 'profile' from source: play vars 26264 1727204265.94550: variable 'interface' from source: set_fact 26264 1727204265.94608: variable 'interface' from source: set_fact 26264 1727204265.94644: Evaluated conditional (__network_wpa_supplicant_required): False 26264 1727204265.94655: when evaluation is False, skipping this task 26264 1727204265.94662: _execute() done 26264 1727204265.94682: dumping result to json 26264 1727204265.94690: done dumping result, returning 26264 1727204265.94702: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-5ff5-08b0-000000000049] 26264 1727204265.94712: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000049 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 26264 1727204265.94877: no more pending results, returning what we have 26264 1727204265.94881: results queue empty 26264 1727204265.94882: checking for any_errors_fatal 26264 1727204265.94903: done checking for any_errors_fatal 26264 1727204265.94903: checking for max_fail_percentage 26264 1727204265.94905: done checking for max_fail_percentage 26264 1727204265.94906: checking to see if all hosts have failed and the running result is not ok 26264 1727204265.94907: done checking to see if all hosts have failed 26264 1727204265.94908: getting the remaining hosts for this loop 26264 1727204265.94910: done getting the remaining hosts for this loop 26264 1727204265.94914: getting the next task for host managed-node3 26264 1727204265.94920: done getting next task for host managed-node3 26264 1727204265.94925: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 26264 1727204265.94927: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204265.94941: getting variables 26264 1727204265.94943: in VariableManager get_vars() 26264 1727204265.94986: Calling all_inventory to load vars for managed-node3 26264 1727204265.94989: Calling groups_inventory to load vars for managed-node3 26264 1727204265.94992: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204265.95003: Calling all_plugins_play to load vars for managed-node3 26264 1727204265.95006: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204265.95008: Calling groups_plugins_play to load vars for managed-node3 26264 1727204265.96081: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000049 26264 1727204265.96085: WORKER PROCESS EXITING 26264 1727204265.96975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204265.99923: done with get_vars() 26264 1727204265.99954: done getting variables 26264 1727204266.00019: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.148) 0:00:29.849 ***** 26264 1727204266.00054: entering _queue_task() for managed-node3/service 26264 1727204266.00785: worker is 1 (out of 1 available) 26264 1727204266.00798: exiting _queue_task() for managed-node3/service 26264 
1727204266.00812: done queuing things up, now waiting for results queue to drain 26264 1727204266.00814: waiting for pending results... 26264 1727204266.01421: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 26264 1727204266.01545: in run() - task 0affcd87-79f5-5ff5-08b0-00000000004a 26264 1727204266.01691: variable 'ansible_search_path' from source: unknown 26264 1727204266.01699: variable 'ansible_search_path' from source: unknown 26264 1727204266.01739: calling self._execute() 26264 1727204266.01930: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204266.01941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204266.01959: variable 'omit' from source: magic vars 26264 1727204266.02367: variable 'ansible_distribution_major_version' from source: facts 26264 1727204266.02387: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204266.02517: variable 'network_provider' from source: set_fact 26264 1727204266.02528: Evaluated conditional (network_provider == "initscripts"): False 26264 1727204266.02535: when evaluation is False, skipping this task 26264 1727204266.02541: _execute() done 26264 1727204266.02552: dumping result to json 26264 1727204266.02565: done dumping result, returning 26264 1727204266.02577: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-5ff5-08b0-00000000004a] 26264 1727204266.02589: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004a skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204266.02738: no more pending results, returning what we have 26264 1727204266.02742: results queue empty 26264 1727204266.02743: checking for any_errors_fatal 26264 1727204266.02755: done checking for 
any_errors_fatal 26264 1727204266.02756: checking for max_fail_percentage 26264 1727204266.02758: done checking for max_fail_percentage 26264 1727204266.02759: checking to see if all hosts have failed and the running result is not ok 26264 1727204266.02761: done checking to see if all hosts have failed 26264 1727204266.02761: getting the remaining hosts for this loop 26264 1727204266.02763: done getting the remaining hosts for this loop 26264 1727204266.02769: getting the next task for host managed-node3 26264 1727204266.02777: done getting next task for host managed-node3 26264 1727204266.02782: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 26264 1727204266.02784: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204266.02800: getting variables 26264 1727204266.02802: in VariableManager get_vars() 26264 1727204266.02845: Calling all_inventory to load vars for managed-node3 26264 1727204266.02851: Calling groups_inventory to load vars for managed-node3 26264 1727204266.02854: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204266.02870: Calling all_plugins_play to load vars for managed-node3 26264 1727204266.02873: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204266.02877: Calling groups_plugins_play to load vars for managed-node3 26264 1727204266.04386: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004a 26264 1727204266.04390: WORKER PROCESS EXITING 26264 1727204266.04658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204266.07096: done with get_vars() 26264 1727204266.07128: done getting variables 26264 1727204266.07196: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.071) 0:00:29.921 ***** 26264 1727204266.07233: entering _queue_task() for managed-node3/copy 26264 1727204266.08270: worker is 1 (out of 1 available) 26264 1727204266.08284: exiting _queue_task() for managed-node3/copy 26264 1727204266.08296: done queuing things up, now waiting for results queue to drain 26264 1727204266.08298: waiting for pending results... 
26264 1727204266.09095: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 26264 1727204266.09359: in run() - task 0affcd87-79f5-5ff5-08b0-00000000004b 26264 1727204266.09397: variable 'ansible_search_path' from source: unknown 26264 1727204266.09456: variable 'ansible_search_path' from source: unknown 26264 1727204266.09503: calling self._execute() 26264 1727204266.09658: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204266.09780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204266.09794: variable 'omit' from source: magic vars 26264 1727204266.10660: variable 'ansible_distribution_major_version' from source: facts 26264 1727204266.10685: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204266.10984: variable 'network_provider' from source: set_fact 26264 1727204266.10996: Evaluated conditional (network_provider == "initscripts"): False 26264 1727204266.11005: when evaluation is False, skipping this task 26264 1727204266.11013: _execute() done 26264 1727204266.11025: dumping result to json 26264 1727204266.11033: done dumping result, returning 26264 1727204266.11045: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-5ff5-08b0-00000000004b] 26264 1727204266.11091: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004b skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 26264 1727204266.11261: no more pending results, returning what we have 26264 1727204266.11268: results queue empty 26264 1727204266.11270: checking for any_errors_fatal 26264 1727204266.11277: done checking for any_errors_fatal 26264 1727204266.11278: checking for max_fail_percentage 26264 
1727204266.11280: done checking for max_fail_percentage 26264 1727204266.11281: checking to see if all hosts have failed and the running result is not ok 26264 1727204266.11283: done checking to see if all hosts have failed 26264 1727204266.11283: getting the remaining hosts for this loop 26264 1727204266.11285: done getting the remaining hosts for this loop 26264 1727204266.11290: getting the next task for host managed-node3 26264 1727204266.11297: done getting next task for host managed-node3 26264 1727204266.11301: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 26264 1727204266.11304: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204266.11319: getting variables 26264 1727204266.11321: in VariableManager get_vars() 26264 1727204266.11367: Calling all_inventory to load vars for managed-node3 26264 1727204266.11370: Calling groups_inventory to load vars for managed-node3 26264 1727204266.11372: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204266.11386: Calling all_plugins_play to load vars for managed-node3 26264 1727204266.11388: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204266.11391: Calling groups_plugins_play to load vars for managed-node3 26264 1727204266.12272: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004b 26264 1727204266.12276: WORKER PROCESS EXITING 26264 1727204266.13509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204266.17130: done with get_vars() 26264 1727204266.17167: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.100) 0:00:30.021 ***** 26264 1727204266.17252: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 26264 1727204266.17785: worker is 1 (out of 1 available) 26264 1727204266.17797: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 26264 1727204266.17809: done queuing things up, now waiting for results queue to drain 26264 1727204266.17811: waiting for pending results... 26264 1727204266.18654: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 26264 1727204266.18938: in run() - task 0affcd87-79f5-5ff5-08b0-00000000004c 26264 1727204266.18962: variable 'ansible_search_path' from source: unknown 26264 1727204266.19046: variable 'ansible_search_path' from source: unknown 26264 1727204266.19091: calling self._execute() 26264 1727204266.19306: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204266.19319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204266.19336: variable 'omit' from source: magic vars 26264 1727204266.20185: variable 'ansible_distribution_major_version' from source: facts 26264 1727204266.20203: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204266.20214: variable 'omit' from source: magic vars 26264 1727204266.20595: variable 'omit' from source: magic vars 26264 1727204266.20768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204266.23357: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204266.23570: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 
1727204266.23659: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204266.23767: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204266.23868: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204266.24088: variable 'network_provider' from source: set_fact 26264 1727204266.24342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204266.24503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204266.24535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204266.24587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204266.24688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204266.24775: variable 'omit' from source: magic vars 26264 1727204266.24988: variable 'omit' from source: magic vars 26264 1727204266.25355: variable 'network_connections' from source: play vars 26264 1727204266.25376: variable 'profile' from source: play vars 26264 1727204266.25446: variable 'profile' from source: play vars 26264 1727204266.25578: variable 'interface' from source: set_fact 
26264 1727204266.25641: variable 'interface' from source: set_fact 26264 1727204266.25895: variable 'omit' from source: magic vars 26264 1727204266.25981: variable '__lsr_ansible_managed' from source: task vars 26264 1727204266.26108: variable '__lsr_ansible_managed' from source: task vars 26264 1727204266.26635: Loaded config def from plugin (lookup/template) 26264 1727204266.26778: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 26264 1727204266.26812: File lookup term: get_ansible_managed.j2 26264 1727204266.26820: variable 'ansible_search_path' from source: unknown 26264 1727204266.26830: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 26264 1727204266.26847: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 26264 1727204266.26899: variable 'ansible_search_path' from source: unknown 26264 
1727204266.36903: variable 'ansible_managed' from source: unknown 26264 1727204266.37366: variable 'omit' from source: magic vars 26264 1727204266.37515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204266.37551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204266.37577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204266.37719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204266.37734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204266.37773: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204266.37783: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204266.37791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204266.37894: Set connection var ansible_pipelining to False 26264 1727204266.38031: Set connection var ansible_connection to ssh 26264 1727204266.38039: Set connection var ansible_shell_type to sh 26264 1727204266.38054: Set connection var ansible_shell_executable to /bin/sh 26264 1727204266.38070: Set connection var ansible_timeout to 10 26264 1727204266.38081: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204266.38111: variable 'ansible_shell_executable' from source: unknown 26264 1727204266.38252: variable 'ansible_connection' from source: unknown 26264 1727204266.38261: variable 'ansible_module_compression' from source: unknown 26264 1727204266.38271: variable 'ansible_shell_type' from source: unknown 26264 1727204266.38278: variable 'ansible_shell_executable' from source: unknown 26264 1727204266.38285: variable 'ansible_host' from source: 
host vars for 'managed-node3' 26264 1727204266.38293: variable 'ansible_pipelining' from source: unknown 26264 1727204266.38299: variable 'ansible_timeout' from source: unknown 26264 1727204266.38307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204266.38454: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204266.38593: variable 'omit' from source: magic vars 26264 1727204266.38604: starting attempt loop 26264 1727204266.38610: running the handler 26264 1727204266.38626: _low_level_execute_command(): starting 26264 1727204266.38698: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204266.39955: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204266.39972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204266.39985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.40004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204266.40045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204266.40058: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204266.40074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.40091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204266.40106: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204266.40116: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204266.40128: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204266.40140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.40157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204266.40170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204266.40181: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204266.40195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.40276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204266.40298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204266.40313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204266.40396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204266.42024: stdout chunk (state=3): >>>/root <<< 26264 1727204266.42183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204266.42237: stderr chunk (state=3): >>><<< 26264 1727204266.42241: stdout chunk (state=3): >>><<< 26264 1727204266.42357: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204266.42360: _low_level_execute_command(): starting 26264 1727204266.42365: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740 `" && echo ansible-tmp-1727204266.4226475-28840-18249512992740="` echo /root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740 `" ) && sleep 0' 26264 1727204266.43717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204266.43721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.43831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204266.43876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204266.43933: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204266.43945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.43961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204266.43975: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204266.43986: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 26264 1727204266.43999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204266.44013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.44033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204266.44131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204266.44145: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204266.44158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.44237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204266.44290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204266.44305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204266.44431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204266.46272: stdout chunk (state=3): >>>ansible-tmp-1727204266.4226475-28840-18249512992740=/root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740 <<< 26264 1727204266.46469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204266.46473: stdout chunk (state=3): >>><<< 26264 1727204266.46476: stderr chunk (state=3): >>><<< 26264 1727204266.46571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204266.4226475-28840-18249512992740=/root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204266.46579: variable 'ansible_module_compression' from source: unknown 26264 1727204266.46673: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 26264 1727204266.46676: variable 'ansible_facts' from source: unknown 26264 1727204266.46733: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740/AnsiballZ_network_connections.py 26264 1727204266.47355: Sending initial data 26264 1727204266.47358: Sent initial data (167 bytes) 26264 1727204266.50151: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.50156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204266.50192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.50195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.50198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.50476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204266.50500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204266.50518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204266.50646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204266.52368: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204266.52401: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204266.52460: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpjq3dvfwq /root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740/AnsiballZ_network_connections.py <<< 26264 1727204266.52497: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204266.54390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204266.54643: stderr chunk (state=3): >>><<< 26264 1727204266.54650: stdout chunk (state=3): >>><<< 26264 1727204266.54654: done transferring module to remote 26264 1727204266.54656: _low_level_execute_command(): starting 26264 1727204266.54659: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740/ /root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740/AnsiballZ_network_connections.py && sleep 0' 26264 1727204266.56104: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204266.56117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204266.56129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.56144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204266.56196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204266.56207: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204266.56218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.56233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204266.56243: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204266.56255: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 
1727204266.56266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204266.56282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.56295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204266.56305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204266.56314: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204266.56324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.56406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204266.56425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204266.56452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204266.56832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204266.58660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204266.58666: stdout chunk (state=3): >>><<< 26264 1727204266.58668: stderr chunk (state=3): >>><<< 26264 1727204266.58771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204266.58775: _low_level_execute_command(): starting 26264 1727204266.58778: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740/AnsiballZ_network_connections.py && sleep 0' 26264 1727204266.60236: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204266.60251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204266.60268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.60311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204266.60441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204266.60453: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204266.60468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.60485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204266.60497: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204266.60517: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204266.60530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
26264 1727204266.60543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.60559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204266.60573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204266.60584: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204266.60597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.60682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204266.60858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204266.60877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204266.61076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204266.87880: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 26264 1727204266.89798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204266.89803: stdout chunk (state=3): >>><<< 26264 1727204266.89805: stderr chunk (state=3): >>><<< 26264 1727204266.89871: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204266.89970: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204266.89973: _low_level_execute_command(): starting 26264 1727204266.89976: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204266.4226475-28840-18249512992740/ > /dev/null 2>&1 && sleep 0' 26264 1727204266.90930: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204266.90934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204266.90973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204266.90977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 26264 1727204266.90979: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204266.90981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204266.91049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204266.91052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204266.91055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204266.91116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204266.92890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204266.92969: stderr chunk (state=3): >>><<< 26264 1727204266.92973: stdout chunk (state=3): >>><<< 26264 1727204266.93418: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204266.93426: handler run complete 26264 1727204266.93428: attempt loop complete, returning result 26264 1727204266.93430: _execute() done 26264 1727204266.93432: dumping result to json 26264 1727204266.93434: done dumping result, returning 26264 1727204266.93436: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-5ff5-08b0-00000000004c] 26264 1727204266.93438: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004c 26264 1727204266.93516: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004c 26264 1727204266.93519: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 26264 1727204266.93617: no more pending results, returning what we have 26264 1727204266.93620: results queue empty 26264 1727204266.93621: checking for any_errors_fatal 26264 1727204266.93628: done checking for any_errors_fatal 26264 1727204266.93630: checking for max_fail_percentage 26264 1727204266.93632: done checking for max_fail_percentage 26264 1727204266.93633: checking to see if all hosts have failed and the running result is not ok 26264 1727204266.93634: done checking to see if all hosts have failed 26264 1727204266.93635: getting the remaining hosts for this loop 26264 1727204266.93636: done getting the remaining hosts for this loop 26264 1727204266.93640: getting the next task for host managed-node3 26264 1727204266.93645: 
done getting next task for host managed-node3 26264 1727204266.93651: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 26264 1727204266.93653: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204266.93661: getting variables 26264 1727204266.93663: in VariableManager get_vars() 26264 1727204266.93703: Calling all_inventory to load vars for managed-node3 26264 1727204266.93706: Calling groups_inventory to load vars for managed-node3 26264 1727204266.93708: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204266.93717: Calling all_plugins_play to load vars for managed-node3 26264 1727204266.93720: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204266.93722: Calling groups_plugins_play to load vars for managed-node3 26264 1727204266.95584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204266.97452: done with get_vars() 26264 1727204266.97479: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.803) 0:00:30.824 ***** 26264 1727204266.97568: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 26264 1727204266.97900: worker is 1 (out of 1 available) 26264 1727204266.97915: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 26264 1727204266.97928: done queuing things up, now waiting for results queue to drain 26264 1727204266.97930: waiting for pending results... 
26264 1727204266.98220: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 26264 1727204266.98327: in run() - task 0affcd87-79f5-5ff5-08b0-00000000004d 26264 1727204266.98341: variable 'ansible_search_path' from source: unknown 26264 1727204266.98345: variable 'ansible_search_path' from source: unknown 26264 1727204266.98387: calling self._execute() 26264 1727204266.98474: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204266.98485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204266.98495: variable 'omit' from source: magic vars 26264 1727204266.98861: variable 'ansible_distribution_major_version' from source: facts 26264 1727204266.98874: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204266.98993: variable 'network_state' from source: role '' defaults 26264 1727204266.99003: Evaluated conditional (network_state != {}): False 26264 1727204266.99006: when evaluation is False, skipping this task 26264 1727204266.99009: _execute() done 26264 1727204266.99012: dumping result to json 26264 1727204266.99014: done dumping result, returning 26264 1727204266.99026: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-5ff5-08b0-00000000004d] 26264 1727204266.99032: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004d 26264 1727204266.99123: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004d 26264 1727204266.99126: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204266.99179: no more pending results, returning what we have 26264 1727204266.99183: results queue empty 26264 1727204266.99184: checking for any_errors_fatal 26264 1727204266.99193: done checking for any_errors_fatal 
26264 1727204266.99194: checking for max_fail_percentage 26264 1727204266.99196: done checking for max_fail_percentage 26264 1727204266.99197: checking to see if all hosts have failed and the running result is not ok 26264 1727204266.99198: done checking to see if all hosts have failed 26264 1727204266.99199: getting the remaining hosts for this loop 26264 1727204266.99200: done getting the remaining hosts for this loop 26264 1727204266.99205: getting the next task for host managed-node3 26264 1727204266.99211: done getting next task for host managed-node3 26264 1727204266.99215: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 26264 1727204266.99217: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204266.99233: getting variables 26264 1727204266.99235: in VariableManager get_vars() 26264 1727204266.99274: Calling all_inventory to load vars for managed-node3 26264 1727204266.99277: Calling groups_inventory to load vars for managed-node3 26264 1727204266.99279: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204266.99292: Calling all_plugins_play to load vars for managed-node3 26264 1727204266.99295: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204266.99297: Calling groups_plugins_play to load vars for managed-node3 26264 1727204267.00877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204267.02520: done with get_vars() 26264 1727204267.02551: done getting variables 26264 1727204267.02618: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.050) 0:00:30.875 ***** 26264 1727204267.02652: entering _queue_task() for managed-node3/debug 26264 1727204267.02986: worker is 1 (out of 1 available) 26264 1727204267.03000: exiting _queue_task() for managed-node3/debug 26264 1727204267.03014: done queuing things up, now waiting for results queue to drain 26264 1727204267.03015: waiting for pending results... 
26264 1727204267.03308: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 26264 1727204267.03433: in run() - task 0affcd87-79f5-5ff5-08b0-00000000004e 26264 1727204267.03457: variable 'ansible_search_path' from source: unknown 26264 1727204267.03471: variable 'ansible_search_path' from source: unknown 26264 1727204267.03512: calling self._execute() 26264 1727204267.03615: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.03628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.03644: variable 'omit' from source: magic vars 26264 1727204267.04053: variable 'ansible_distribution_major_version' from source: facts 26264 1727204267.04074: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204267.04086: variable 'omit' from source: magic vars 26264 1727204267.04137: variable 'omit' from source: magic vars 26264 1727204267.04182: variable 'omit' from source: magic vars 26264 1727204267.04232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204267.04276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204267.04304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204267.04329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204267.04347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204267.04380: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204267.04388: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.04394: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 26264 1727204267.04493: Set connection var ansible_pipelining to False 26264 1727204267.04500: Set connection var ansible_connection to ssh 26264 1727204267.04505: Set connection var ansible_shell_type to sh 26264 1727204267.04513: Set connection var ansible_shell_executable to /bin/sh 26264 1727204267.04523: Set connection var ansible_timeout to 10 26264 1727204267.04532: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204267.04561: variable 'ansible_shell_executable' from source: unknown 26264 1727204267.04570: variable 'ansible_connection' from source: unknown 26264 1727204267.04577: variable 'ansible_module_compression' from source: unknown 26264 1727204267.04582: variable 'ansible_shell_type' from source: unknown 26264 1727204267.04587: variable 'ansible_shell_executable' from source: unknown 26264 1727204267.04591: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.04597: variable 'ansible_pipelining' from source: unknown 26264 1727204267.04602: variable 'ansible_timeout' from source: unknown 26264 1727204267.04608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.04742: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204267.04759: variable 'omit' from source: magic vars 26264 1727204267.04774: starting attempt loop 26264 1727204267.04781: running the handler 26264 1727204267.04908: variable '__network_connections_result' from source: set_fact 26264 1727204267.04959: handler run complete 26264 1727204267.04987: attempt loop complete, returning result 26264 1727204267.04994: _execute() done 26264 1727204267.05000: dumping result to json 26264 1727204267.05006: 
done dumping result, returning 26264 1727204267.05017: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-5ff5-08b0-00000000004e] 26264 1727204267.05024: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004e ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 26264 1727204267.05171: no more pending results, returning what we have 26264 1727204267.05175: results queue empty 26264 1727204267.05176: checking for any_errors_fatal 26264 1727204267.05183: done checking for any_errors_fatal 26264 1727204267.05183: checking for max_fail_percentage 26264 1727204267.05185: done checking for max_fail_percentage 26264 1727204267.05186: checking to see if all hosts have failed and the running result is not ok 26264 1727204267.05187: done checking to see if all hosts have failed 26264 1727204267.05188: getting the remaining hosts for this loop 26264 1727204267.05190: done getting the remaining hosts for this loop 26264 1727204267.05194: getting the next task for host managed-node3 26264 1727204267.05201: done getting next task for host managed-node3 26264 1727204267.05206: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 26264 1727204267.05209: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204267.05219: getting variables 26264 1727204267.05220: in VariableManager get_vars() 26264 1727204267.05261: Calling all_inventory to load vars for managed-node3 26264 1727204267.05267: Calling groups_inventory to load vars for managed-node3 26264 1727204267.05270: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204267.05282: Calling all_plugins_play to load vars for managed-node3 26264 1727204267.05285: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204267.05288: Calling groups_plugins_play to load vars for managed-node3 26264 1727204267.06391: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004e 26264 1727204267.06395: WORKER PROCESS EXITING 26264 1727204267.07208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204267.08834: done with get_vars() 26264 1727204267.08867: done getting variables 26264 1727204267.08931: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.063) 0:00:30.938 ***** 26264 1727204267.08966: entering _queue_task() for managed-node3/debug 26264 1727204267.09306: worker is 1 (out of 1 available) 26264 1727204267.09320: exiting _queue_task() for managed-node3/debug 26264 1727204267.09334: done queuing things up, now waiting for results queue to drain 26264 1727204267.09335: waiting for pending results... 
26264 1727204267.09623: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 26264 1727204267.09752: in run() - task 0affcd87-79f5-5ff5-08b0-00000000004f 26264 1727204267.09777: variable 'ansible_search_path' from source: unknown 26264 1727204267.09788: variable 'ansible_search_path' from source: unknown 26264 1727204267.09829: calling self._execute() 26264 1727204267.09930: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.09942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.09956: variable 'omit' from source: magic vars 26264 1727204267.10347: variable 'ansible_distribution_major_version' from source: facts 26264 1727204267.10366: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204267.10377: variable 'omit' from source: magic vars 26264 1727204267.10417: variable 'omit' from source: magic vars 26264 1727204267.10460: variable 'omit' from source: magic vars 26264 1727204267.10503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204267.10548: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204267.10575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204267.10596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204267.10610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204267.10643: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204267.10654: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.10661: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 26264 1727204267.10763: Set connection var ansible_pipelining to False 26264 1727204267.10774: Set connection var ansible_connection to ssh 26264 1727204267.10780: Set connection var ansible_shell_type to sh 26264 1727204267.10790: Set connection var ansible_shell_executable to /bin/sh 26264 1727204267.10800: Set connection var ansible_timeout to 10 26264 1727204267.10810: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204267.10838: variable 'ansible_shell_executable' from source: unknown 26264 1727204267.10845: variable 'ansible_connection' from source: unknown 26264 1727204267.10851: variable 'ansible_module_compression' from source: unknown 26264 1727204267.10857: variable 'ansible_shell_type' from source: unknown 26264 1727204267.10864: variable 'ansible_shell_executable' from source: unknown 26264 1727204267.10875: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.10884: variable 'ansible_pipelining' from source: unknown 26264 1727204267.10890: variable 'ansible_timeout' from source: unknown 26264 1727204267.10895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.11027: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204267.11044: variable 'omit' from source: magic vars 26264 1727204267.11055: starting attempt loop 26264 1727204267.11062: running the handler 26264 1727204267.11119: variable '__network_connections_result' from source: set_fact 26264 1727204267.11207: variable '__network_connections_result' from source: set_fact 26264 1727204267.11318: handler run complete 26264 1727204267.11350: attempt loop complete, returning result 26264 1727204267.11358: 
_execute() done 26264 1727204267.11367: dumping result to json 26264 1727204267.11377: done dumping result, returning 26264 1727204267.11389: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-5ff5-08b0-00000000004f] 26264 1727204267.11398: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004f 26264 1727204267.11514: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000004f ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 26264 1727204267.11600: no more pending results, returning what we have 26264 1727204267.11605: results queue empty 26264 1727204267.11606: checking for any_errors_fatal 26264 1727204267.11614: done checking for any_errors_fatal 26264 1727204267.11615: checking for max_fail_percentage 26264 1727204267.11617: done checking for max_fail_percentage 26264 1727204267.11618: checking to see if all hosts have failed and the running result is not ok 26264 1727204267.11620: done checking to see if all hosts have failed 26264 1727204267.11621: getting the remaining hosts for this loop 26264 1727204267.11623: done getting the remaining hosts for this loop 26264 1727204267.11629: getting the next task for host managed-node3 26264 1727204267.11636: done getting next task for host managed-node3 26264 1727204267.11640: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 26264 1727204267.11643: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204267.11653: getting variables 26264 1727204267.11655: in VariableManager get_vars() 26264 1727204267.11697: Calling all_inventory to load vars for managed-node3 26264 1727204267.11701: Calling groups_inventory to load vars for managed-node3 26264 1727204267.11703: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204267.11716: Calling all_plugins_play to load vars for managed-node3 26264 1727204267.11720: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204267.11723: Calling groups_plugins_play to load vars for managed-node3 26264 1727204267.12682: WORKER PROCESS EXITING 26264 1727204267.13425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204267.15105: done with get_vars() 26264 1727204267.15136: done getting variables 26264 1727204267.15201: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.062) 0:00:31.001 ***** 26264 1727204267.15241: entering _queue_task() for managed-node3/debug 26264 1727204267.15577: worker is 1 (out of 1 available) 26264 1727204267.15591: exiting _queue_task() for managed-node3/debug 26264 1727204267.15605: done queuing things up, now waiting for results queue to drain 26264 1727204267.15606: waiting for pending results... 
26264 1727204267.15898: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 26264 1727204267.16024: in run() - task 0affcd87-79f5-5ff5-08b0-000000000050 26264 1727204267.16045: variable 'ansible_search_path' from source: unknown 26264 1727204267.16054: variable 'ansible_search_path' from source: unknown 26264 1727204267.16094: calling self._execute() 26264 1727204267.16190: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.16200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.16212: variable 'omit' from source: magic vars 26264 1727204267.16621: variable 'ansible_distribution_major_version' from source: facts 26264 1727204267.16637: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204267.16760: variable 'network_state' from source: role '' defaults 26264 1727204267.16776: Evaluated conditional (network_state != {}): False 26264 1727204267.16783: when evaluation is False, skipping this task 26264 1727204267.16788: _execute() done 26264 1727204267.16794: dumping result to json 26264 1727204267.16800: done dumping result, returning 26264 1727204267.16815: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-5ff5-08b0-000000000050] 26264 1727204267.16826: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000050 skipping: [managed-node3] => { "false_condition": "network_state != {}" } 26264 1727204267.16977: no more pending results, returning what we have 26264 1727204267.16981: results queue empty 26264 1727204267.16982: checking for any_errors_fatal 26264 1727204267.16993: done checking for any_errors_fatal 26264 1727204267.16994: checking for max_fail_percentage 26264 1727204267.16996: done checking for max_fail_percentage 26264 1727204267.16997: checking to see if all hosts have 
failed and the running result is not ok 26264 1727204267.16998: done checking to see if all hosts have failed 26264 1727204267.16999: getting the remaining hosts for this loop 26264 1727204267.17001: done getting the remaining hosts for this loop 26264 1727204267.17005: getting the next task for host managed-node3 26264 1727204267.17012: done getting next task for host managed-node3 26264 1727204267.17017: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 26264 1727204267.17020: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204267.17035: getting variables 26264 1727204267.17037: in VariableManager get_vars() 26264 1727204267.17079: Calling all_inventory to load vars for managed-node3 26264 1727204267.17082: Calling groups_inventory to load vars for managed-node3 26264 1727204267.17085: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204267.17099: Calling all_plugins_play to load vars for managed-node3 26264 1727204267.17102: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204267.17105: Calling groups_plugins_play to load vars for managed-node3 26264 1727204267.18086: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000050 26264 1727204267.18090: WORKER PROCESS EXITING 26264 1727204267.19017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204267.20665: done with get_vars() 26264 1727204267.20699: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:47 -0400 
(0:00:00.055) 0:00:31.057 ***** 26264 1727204267.20801: entering _queue_task() for managed-node3/ping 26264 1727204267.21137: worker is 1 (out of 1 available) 26264 1727204267.21150: exiting _queue_task() for managed-node3/ping 26264 1727204267.21162: done queuing things up, now waiting for results queue to drain 26264 1727204267.21165: waiting for pending results... 26264 1727204267.21453: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 26264 1727204267.21586: in run() - task 0affcd87-79f5-5ff5-08b0-000000000051 26264 1727204267.21612: variable 'ansible_search_path' from source: unknown 26264 1727204267.21620: variable 'ansible_search_path' from source: unknown 26264 1727204267.21663: calling self._execute() 26264 1727204267.21769: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.21782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.21796: variable 'omit' from source: magic vars 26264 1727204267.22171: variable 'ansible_distribution_major_version' from source: facts 26264 1727204267.22191: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204267.22202: variable 'omit' from source: magic vars 26264 1727204267.22246: variable 'omit' from source: magic vars 26264 1727204267.22287: variable 'omit' from source: magic vars 26264 1727204267.22329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204267.22374: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204267.22399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204267.22420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204267.22436: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204267.22472: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204267.22480: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.22486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.22585: Set connection var ansible_pipelining to False 26264 1727204267.22592: Set connection var ansible_connection to ssh 26264 1727204267.22597: Set connection var ansible_shell_type to sh 26264 1727204267.22605: Set connection var ansible_shell_executable to /bin/sh 26264 1727204267.22615: Set connection var ansible_timeout to 10 26264 1727204267.22624: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204267.22651: variable 'ansible_shell_executable' from source: unknown 26264 1727204267.22658: variable 'ansible_connection' from source: unknown 26264 1727204267.22667: variable 'ansible_module_compression' from source: unknown 26264 1727204267.22680: variable 'ansible_shell_type' from source: unknown 26264 1727204267.22691: variable 'ansible_shell_executable' from source: unknown 26264 1727204267.22698: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.22706: variable 'ansible_pipelining' from source: unknown 26264 1727204267.22712: variable 'ansible_timeout' from source: unknown 26264 1727204267.22720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.22932: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204267.22949: variable 'omit' from source: magic vars 26264 1727204267.22960: starting attempt loop 26264 1727204267.22970: running 
the handler 26264 1727204267.23012: _low_level_execute_command(): starting 26264 1727204267.23027: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204267.23823: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204267.23840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.23856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.23885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.23931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.23944: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204267.23959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.23980: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204267.23991: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204267.24006: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204267.24019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.24036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.24052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.24068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.24081: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204267.24097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.24184: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.24201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.24225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204267.24455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204267.26020: stdout chunk (state=3): >>>/root <<< 26264 1727204267.26233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204267.26237: stdout chunk (state=3): >>><<< 26264 1727204267.26239: stderr chunk (state=3): >>><<< 26264 1727204267.26371: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204267.26381: _low_level_execute_command(): starting 26264 1727204267.26385: _low_level_execute_command(): executing: /bin/sh -c '( 
umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627 `" && echo ansible-tmp-1727204267.2626624-28911-4247959126627="` echo /root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627 `" ) && sleep 0' 26264 1727204267.27032: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204267.27046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.27063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.27082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.27122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.27132: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204267.27147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.27168: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204267.27179: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204267.27192: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204267.27209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.27221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.27236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.27248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.27258: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204267.27275: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.27351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.27379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.27396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204267.27469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204267.29298: stdout chunk (state=3): >>>ansible-tmp-1727204267.2626624-28911-4247959126627=/root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627 <<< 26264 1727204267.29511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204267.29515: stdout chunk (state=3): >>><<< 26264 1727204267.29518: stderr chunk (state=3): >>><<< 26264 1727204267.29678: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204267.2626624-28911-4247959126627=/root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204267.29682: variable 'ansible_module_compression' from source: unknown 26264 1727204267.29685: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 26264 1727204267.29786: variable 'ansible_facts' from source: unknown 26264 1727204267.29789: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627/AnsiballZ_ping.py 26264 1727204267.30603: Sending initial data 26264 1727204267.30606: Sent initial data (151 bytes) 26264 1727204267.31906: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.31910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.31927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.32018: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204267.32029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.32044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204267.32054: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204267.32061: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204267.32074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.32084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.32096: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.32104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.32111: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204267.32121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.32194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.32214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.32226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204267.32301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204267.33988: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204267.34024: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204267.34072: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp2k9el448 /root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627/AnsiballZ_ping.py <<< 26264 1727204267.34106: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 
1727204267.35475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204267.35517: stderr chunk (state=3): >>><<< 26264 1727204267.35520: stdout chunk (state=3): >>><<< 26264 1727204267.35543: done transferring module to remote 26264 1727204267.35555: _low_level_execute_command(): starting 26264 1727204267.35561: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627/ /root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627/AnsiballZ_ping.py && sleep 0' 26264 1727204267.36289: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.36295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.36400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204267.36407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 26264 1727204267.36413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.36427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.36434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.36513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.36517: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.36533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204267.36609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204267.38390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204267.38393: stdout chunk (state=3): >>><<< 26264 1727204267.38396: stderr chunk (state=3): >>><<< 26264 1727204267.38495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204267.38499: _low_level_execute_command(): starting 26264 1727204267.38501: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627/AnsiballZ_ping.py && sleep 0' 26264 1727204267.39071: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204267.39087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.39102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.39121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.39170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.39184: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204267.39200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.39218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204267.39232: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204267.39245: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204267.39261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.39277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.39291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.39301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.39309: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204267.39320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.39400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.39415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.39427: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 26264 1727204267.39512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204267.52376: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 26264 1727204267.53417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204267.53421: stdout chunk (state=3): >>><<< 26264 1727204267.53426: stderr chunk (state=3): >>><<< 26264 1727204267.53443: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204267.53468: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204267.53478: _low_level_execute_command(): starting 26264 1727204267.53483: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204267.2626624-28911-4247959126627/ > /dev/null 2>&1 && sleep 0' 26264 1727204267.54105: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204267.54113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.54125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.54137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.54179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.54186: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204267.54195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.54208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204267.54215: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 
1727204267.54221: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204267.54229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.54236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.54251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.54254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.54261: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204267.54273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.54357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.54367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.54385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204267.54432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204267.56274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204267.56298: stdout chunk (state=3): >>><<< 26264 1727204267.56301: stderr chunk (state=3): >>><<< 26264 1727204267.56474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204267.56479: handler run complete 26264 1727204267.56482: attempt loop complete, returning result 26264 1727204267.56484: _execute() done 26264 1727204267.56487: dumping result to json 26264 1727204267.56489: done dumping result, returning 26264 1727204267.56491: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-5ff5-08b0-000000000051] 26264 1727204267.56493: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000051 26264 1727204267.56572: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000051 26264 1727204267.56577: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 26264 1727204267.56644: no more pending results, returning what we have 26264 1727204267.56651: results queue empty 26264 1727204267.56652: checking for any_errors_fatal 26264 1727204267.56661: done checking for any_errors_fatal 26264 1727204267.56662: checking for max_fail_percentage 26264 1727204267.56667: done checking for max_fail_percentage 26264 1727204267.56668: checking to see if all hosts have failed and the running result is not ok 26264 1727204267.56670: done checking to see if all hosts have failed 26264 1727204267.56670: getting the remaining hosts for this loop 26264 1727204267.56672: done getting the remaining hosts for this loop 26264 1727204267.56677: getting 
the next task for host managed-node3 26264 1727204267.56685: done getting next task for host managed-node3 26264 1727204267.56688: ^ task is: TASK: meta (role_complete) 26264 1727204267.56690: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204267.56702: getting variables 26264 1727204267.56704: in VariableManager get_vars() 26264 1727204267.56745: Calling all_inventory to load vars for managed-node3 26264 1727204267.56751: Calling groups_inventory to load vars for managed-node3 26264 1727204267.56754: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204267.56773: Calling all_plugins_play to load vars for managed-node3 26264 1727204267.56777: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204267.56781: Calling groups_plugins_play to load vars for managed-node3 26264 1727204267.58593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204267.60412: done with get_vars() 26264 1727204267.60442: done getting variables 26264 1727204267.60535: done queuing things up, now waiting for results queue to drain 26264 1727204267.60538: results queue empty 26264 1727204267.60539: checking for any_errors_fatal 26264 1727204267.60542: done checking for any_errors_fatal 26264 1727204267.60543: checking for max_fail_percentage 26264 1727204267.60545: done checking for max_fail_percentage 26264 1727204267.60546: checking to see if all hosts have failed and the running result is not ok 26264 1727204267.60547: done checking to see if all hosts have failed 26264 1727204267.60547: getting the remaining hosts for this loop 26264 1727204267.60550: done getting the remaining hosts for this loop 26264 1727204267.60554: 
getting the next task for host managed-node3 26264 1727204267.60557: done getting next task for host managed-node3 26264 1727204267.60559: ^ task is: TASK: meta (flush_handlers) 26264 1727204267.60565: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204267.60568: getting variables 26264 1727204267.60570: in VariableManager get_vars() 26264 1727204267.60585: Calling all_inventory to load vars for managed-node3 26264 1727204267.60587: Calling groups_inventory to load vars for managed-node3 26264 1727204267.60590: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204267.60595: Calling all_plugins_play to load vars for managed-node3 26264 1727204267.60598: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204267.60600: Calling groups_plugins_play to load vars for managed-node3 26264 1727204267.61962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204267.63784: done with get_vars() 26264 1727204267.63824: done getting variables 26264 1727204267.63888: in VariableManager get_vars() 26264 1727204267.63903: Calling all_inventory to load vars for managed-node3 26264 1727204267.63906: Calling groups_inventory to load vars for managed-node3 26264 1727204267.63908: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204267.63914: Calling all_plugins_play to load vars for managed-node3 26264 1727204267.63916: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204267.63925: Calling groups_plugins_play to load vars for managed-node3 26264 1727204267.65232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 
1727204267.67117: done with get_vars() 26264 1727204267.67151: done queuing things up, now waiting for results queue to drain 26264 1727204267.67154: results queue empty 26264 1727204267.67155: checking for any_errors_fatal 26264 1727204267.67156: done checking for any_errors_fatal 26264 1727204267.67157: checking for max_fail_percentage 26264 1727204267.67158: done checking for max_fail_percentage 26264 1727204267.67159: checking to see if all hosts have failed and the running result is not ok 26264 1727204267.67160: done checking to see if all hosts have failed 26264 1727204267.67161: getting the remaining hosts for this loop 26264 1727204267.67162: done getting the remaining hosts for this loop 26264 1727204267.67167: getting the next task for host managed-node3 26264 1727204267.67172: done getting next task for host managed-node3 26264 1727204267.67173: ^ task is: TASK: meta (flush_handlers) 26264 1727204267.67179: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204267.67182: getting variables 26264 1727204267.67183: in VariableManager get_vars() 26264 1727204267.67196: Calling all_inventory to load vars for managed-node3 26264 1727204267.67198: Calling groups_inventory to load vars for managed-node3 26264 1727204267.67200: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204267.67206: Calling all_plugins_play to load vars for managed-node3 26264 1727204267.67209: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204267.67211: Calling groups_plugins_play to load vars for managed-node3 26264 1727204267.68600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204267.70392: done with get_vars() 26264 1727204267.70418: done getting variables 26264 1727204267.70486: in VariableManager get_vars() 26264 1727204267.70503: Calling all_inventory to load vars for managed-node3 26264 1727204267.70506: Calling groups_inventory to load vars for managed-node3 26264 1727204267.70508: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204267.70513: Calling all_plugins_play to load vars for managed-node3 26264 1727204267.70516: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204267.70518: Calling groups_plugins_play to load vars for managed-node3 26264 1727204267.72146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204267.74181: done with get_vars() 26264 1727204267.74219: done queuing things up, now waiting for results queue to drain 26264 1727204267.74222: results queue empty 26264 1727204267.74223: checking for any_errors_fatal 26264 1727204267.74229: done checking for any_errors_fatal 26264 1727204267.74230: checking for max_fail_percentage 26264 1727204267.74231: done checking for max_fail_percentage 26264 1727204267.74232: checking to see if all hosts have failed and the running result is not 
ok 26264 1727204267.74233: done checking to see if all hosts have failed 26264 1727204267.74234: getting the remaining hosts for this loop 26264 1727204267.74235: done getting the remaining hosts for this loop 26264 1727204267.74238: getting the next task for host managed-node3 26264 1727204267.74242: done getting next task for host managed-node3 26264 1727204267.74243: ^ task is: None 26264 1727204267.74244: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204267.74246: done queuing things up, now waiting for results queue to drain 26264 1727204267.74247: results queue empty 26264 1727204267.74250: checking for any_errors_fatal 26264 1727204267.74251: done checking for any_errors_fatal 26264 1727204267.74252: checking for max_fail_percentage 26264 1727204267.74253: done checking for max_fail_percentage 26264 1727204267.74253: checking to see if all hosts have failed and the running result is not ok 26264 1727204267.74254: done checking to see if all hosts have failed 26264 1727204267.74255: getting the next task for host managed-node3 26264 1727204267.74258: done getting next task for host managed-node3 26264 1727204267.74259: ^ task is: None 26264 1727204267.74260: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204267.74313: in VariableManager get_vars() 26264 1727204267.74331: done with get_vars() 26264 1727204267.74343: in VariableManager get_vars() 26264 1727204267.74357: done with get_vars() 26264 1727204267.74362: variable 'omit' from source: magic vars 26264 1727204267.74396: in VariableManager get_vars() 26264 1727204267.74407: done with get_vars() 26264 1727204267.74431: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 26264 1727204267.74772: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 26264 1727204267.74796: getting the remaining hosts for this loop 26264 1727204267.74798: done getting the remaining hosts for this loop 26264 1727204267.74800: getting the next task for host managed-node3 26264 1727204267.74803: done getting next task for host managed-node3 26264 1727204267.74805: ^ task is: TASK: Gathering Facts 26264 1727204267.74807: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204267.74809: getting variables 26264 1727204267.74810: in VariableManager get_vars() 26264 1727204267.74818: Calling all_inventory to load vars for managed-node3 26264 1727204267.74822: Calling groups_inventory to load vars for managed-node3 26264 1727204267.74825: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204267.74831: Calling all_plugins_play to load vars for managed-node3 26264 1727204267.74834: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204267.74837: Calling groups_plugins_play to load vars for managed-node3 26264 1727204267.76174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204267.77936: done with get_vars() 26264 1727204267.77970: done getting variables 26264 1727204267.78024: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.572) 0:00:31.629 ***** 26264 1727204267.78062: entering _queue_task() for managed-node3/gather_facts 26264 1727204267.78409: worker is 1 (out of 1 available) 26264 1727204267.78421: exiting _queue_task() for managed-node3/gather_facts 26264 1727204267.78438: done queuing things up, now waiting for results queue to drain 26264 1727204267.78440: waiting for pending results... 
26264 1727204267.78728: running TaskExecutor() for managed-node3/TASK: Gathering Facts 26264 1727204267.78852: in run() - task 0affcd87-79f5-5ff5-08b0-0000000003f8 26264 1727204267.78876: variable 'ansible_search_path' from source: unknown 26264 1727204267.78917: calling self._execute() 26264 1727204267.79018: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.79029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.79041: variable 'omit' from source: magic vars 26264 1727204267.79445: variable 'ansible_distribution_major_version' from source: facts 26264 1727204267.79469: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204267.79481: variable 'omit' from source: magic vars 26264 1727204267.79512: variable 'omit' from source: magic vars 26264 1727204267.79565: variable 'omit' from source: magic vars 26264 1727204267.79611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204267.79666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204267.79693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204267.79716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204267.79733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204267.79781: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204267.79789: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.79797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.79909: Set connection var ansible_pipelining to False 26264 1727204267.79918: Set 
connection var ansible_connection to ssh 26264 1727204267.79925: Set connection var ansible_shell_type to sh 26264 1727204267.79936: Set connection var ansible_shell_executable to /bin/sh 26264 1727204267.79953: Set connection var ansible_timeout to 10 26264 1727204267.79974: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204267.80002: variable 'ansible_shell_executable' from source: unknown 26264 1727204267.80011: variable 'ansible_connection' from source: unknown 26264 1727204267.80018: variable 'ansible_module_compression' from source: unknown 26264 1727204267.80025: variable 'ansible_shell_type' from source: unknown 26264 1727204267.80033: variable 'ansible_shell_executable' from source: unknown 26264 1727204267.80039: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204267.80046: variable 'ansible_pipelining' from source: unknown 26264 1727204267.80056: variable 'ansible_timeout' from source: unknown 26264 1727204267.80065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204267.80261: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204267.80281: variable 'omit' from source: magic vars 26264 1727204267.80300: starting attempt loop 26264 1727204267.80308: running the handler 26264 1727204267.80328: variable 'ansible_facts' from source: unknown 26264 1727204267.80357: _low_level_execute_command(): starting 26264 1727204267.80374: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204267.81473: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204267.81498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
26264 1727204267.81517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.81537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.81592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.81610: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204267.81628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.81652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204267.81668: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204267.81682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204267.81694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.81709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.81734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.81750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.81766: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204267.81781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.81868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.81893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.81911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204267.81997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 
1727204267.83612: stdout chunk (state=3): >>>/root <<< 26264 1727204267.83716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204267.83797: stderr chunk (state=3): >>><<< 26264 1727204267.83811: stdout chunk (state=3): >>><<< 26264 1727204267.83941: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204267.83945: _low_level_execute_command(): starting 26264 1727204267.83950: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616 `" && echo ansible-tmp-1727204267.8384535-28988-30836375766616="` echo /root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616 `" ) && sleep 0' 26264 1727204267.84541: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 26264 1727204267.84560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.84582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.84601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.84644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.84660: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204267.84680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.84699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204267.84711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204267.84723: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204267.84736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.84753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.84773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.84787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.84800: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204267.84815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.84892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.84915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.84931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 26264 1727204267.85007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204267.86858: stdout chunk (state=3): >>>ansible-tmp-1727204267.8384535-28988-30836375766616=/root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616 <<< 26264 1727204267.87078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204267.87082: stdout chunk (state=3): >>><<< 26264 1727204267.87085: stderr chunk (state=3): >>><<< 26264 1727204267.87271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204267.8384535-28988-30836375766616=/root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204267.87275: variable 'ansible_module_compression' from source: unknown 26264 1727204267.87277: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204267.87470: variable 'ansible_facts' from source: unknown 26264 1727204267.87474: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616/AnsiballZ_setup.py 26264 1727204267.87622: Sending initial data 26264 1727204267.87625: Sent initial data (153 bytes) 26264 1727204267.88614: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204267.88629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.88651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.88674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.88716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.88729: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204267.88743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.88772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204267.88785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204267.88797: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204267.88811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.88825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.88842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.88857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 
1727204267.88875: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204267.88890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.88968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.88996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.89013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204267.89093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204267.90800: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204267.90863: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204267.90899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp5v_q1ki5 /root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616/AnsiballZ_setup.py <<< 26264 1727204267.90913: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204267.93526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204267.93531: stdout chunk (state=3): >>><<< 26264 1727204267.93534: stderr chunk (state=3): >>><<< 26264 1727204267.93537: done transferring module 
to remote 26264 1727204267.93540: _low_level_execute_command(): starting 26264 1727204267.93549: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616/ /root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616/AnsiballZ_setup.py && sleep 0' 26264 1727204267.94131: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204267.94147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.94163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.94186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.94228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.94240: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204267.94254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.94277: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204267.94289: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204267.94300: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204267.94311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.94324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.94338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.94351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.94363: stderr chunk (state=3): >>>debug2: match found <<< 26264 
1727204267.94379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.94456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.94481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.94498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204267.94570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204267.96348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204267.96352: stdout chunk (state=3): >>><<< 26264 1727204267.96355: stderr chunk (state=3): >>><<< 26264 1727204267.96455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204267.96459: 
_low_level_execute_command(): starting 26264 1727204267.96462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616/AnsiballZ_setup.py && sleep 0' 26264 1727204267.97029: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204267.97041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.97054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.97073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.97113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.97123: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204267.97135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.97149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204267.97159: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204267.97174: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204267.97185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204267.97198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204267.97210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204267.97219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204267.97229: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204267.97240: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204267.97321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204267.97340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204267.97354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204267.97434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204268.49297: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.43, "5m": 0.37, "15m": 0.2}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, 
"rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "48", "epoch": "1727204268", "epoch_int": "1727204268", "date": "2024-09-24", "<<< 26264 1727204268.49344: stdout chunk (state=3): >>>time": "14:57:48", "iso8601_micro": "2024-09-24T18:57:48.213524Z", "iso8601": "2024-09-24T18:57:48Z", "iso8601_basic": "20240924T145748213524", "iso8601_basic_short": "20240924T145748", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot 
$@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["peerlsr27", "lo", "eth0", "lsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:14:6b:cc:14:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c014:6bff:fecc:144a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": 
"on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", 
"prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_seg<<< 26264 1727204268.49353: stdout chunk (state=3): >>>mentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "2a:10:ff:43:10:2c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b318:e6ab:cc76:c51b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": 
"off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_a<<< 26264 1727204268.49371: stdout chunk (state=3): >>>ddresses": ["fe80::c014:6bff:fecc:144a", "fe80::8ff:f5ff:fed7:be93", "fe80::b318:e6ab:cc76:c51b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93", "fe80::c014:6bff:fecc:144a"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2818, "ansible_swaptotal_mb": 0, 
"ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 714, "free": 2818}, "nocache": {"free": 3278, "used": 254}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 614, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264279998464, "block_size": 4096, "block_total": 65519355, "block_available": 
64521484, "block_used": 997871, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204268.51101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204268.51105: stdout chunk (state=3): >>><<< 26264 1727204268.51111: stderr chunk (state=3): >>><<< 26264 1727204268.51573: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.43, "5m": 0.37, "15m": 0.2}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "48", "epoch": "1727204268", "epoch_int": "1727204268", "date": "2024-09-24", "time": "14:57:48", "iso8601_micro": "2024-09-24T18:57:48.213524Z", "iso8601": "2024-09-24T18:57:48Z", "iso8601_basic": "20240924T145748213524", "iso8601_basic_short": "20240924T145748", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only 
--read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["peerlsr27", "lo", "eth0", "lsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:14:6b:cc:14:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c014:6bff:fecc:144a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", 
"tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", 
"network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "2a:10:ff:43:10:2c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b318:e6ab:cc76:c51b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::c014:6bff:fecc:144a", "fe80::8ff:f5ff:fed7:be93", "fe80::b318:e6ab:cc76:c51b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93", "fe80::c014:6bff:fecc:144a"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2818, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, 
"ansible_memory_mb": {"real": {"total": 3532, "used": 714, "free": 2818}, "nocache": {"free": 3278, "used": 254}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 614, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264279998464, "block_size": 4096, "block_total": 65519355, "block_available": 64521484, "block_used": 997871, 
"inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
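The setup-module payload above reports `ansible_mounts` with raw byte counts (`size_total`, `size_available`) rather than a usage percentage. A minimal Python sketch of the arithmetic, using the exact figures from the fact payload logged above (the helper name is ours, not part of Ansible):

```python
# Derive disk usage from an ansible_mounts entry, using the byte counts
# reported by the setup module run in the log above.

def mount_usage_percent(mount: dict) -> float:
    """Return the percentage of the filesystem that is in use."""
    used = mount["size_total"] - mount["size_available"]
    return 100.0 * used / mount["size_total"]

# Values copied verbatim from the "/" entry in the logged fact payload.
root_mount = {
    "mount": "/",
    "device": "/dev/xvda1",
    "fstype": "xfs",
    "size_total": 268367278080,
    "size_available": 264279998464,
}

pct = mount_usage_percent(root_mount)
print(f"{root_mount['mount']} is {pct:.2f}% used")
```

The same computation is what a playbook assertion on free space would do with `ansible_mounts` before, say, laying down large temp files.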
26264 1727204268.51801: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204268.51832: _low_level_execute_command(): starting 26264 1727204268.51844: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204267.8384535-28988-30836375766616/ > /dev/null 2>&1 && sleep 0' 26264 1727204268.52547: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204268.52569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204268.52589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204268.52609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204268.52657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204268.52673: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204268.52689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204268.52711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204268.52725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.15.87 is address <<< 26264 1727204268.52731: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204268.52738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204268.52748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204268.52763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204268.52772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204268.52779: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204268.52788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204268.52867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204268.52887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204268.52902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204268.52984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204268.54847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204268.54850: stdout chunk (state=3): >>><<< 26264 1727204268.54853: stderr chunk (state=3): >>><<< 26264 1727204268.55470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204268.55473: handler run complete 26264 1727204268.55475: variable 'ansible_facts' from source: unknown 26264 1727204268.55477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204268.55479: variable 'ansible_facts' from source: unknown 26264 1727204268.55570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204268.55727: attempt loop complete, returning result 26264 1727204268.55737: _execute() done 26264 1727204268.55744: dumping result to json 26264 1727204268.55787: done dumping result, returning 26264 1727204268.55801: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-5ff5-08b0-0000000003f8] 26264 1727204268.55811: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000003f8 ok: [managed-node3] 26264 1727204268.56718: no more pending results, returning what we have 26264 1727204268.56722: results queue empty 26264 1727204268.56723: checking for any_errors_fatal 26264 1727204268.56724: done checking for any_errors_fatal 26264 1727204268.56725: checking for max_fail_percentage 26264 1727204268.56726: done checking for max_fail_percentage 26264 1727204268.56727: checking to see if all hosts have failed and the running result is not ok 26264 1727204268.56729: done 
checking to see if all hosts have failed 26264 1727204268.56729: getting the remaining hosts for this loop 26264 1727204268.56731: done getting the remaining hosts for this loop 26264 1727204268.56736: getting the next task for host managed-node3 26264 1727204268.56742: done getting next task for host managed-node3 26264 1727204268.56744: ^ task is: TASK: meta (flush_handlers) 26264 1727204268.56746: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204268.56750: getting variables 26264 1727204268.56752: in VariableManager get_vars() 26264 1727204268.56781: Calling all_inventory to load vars for managed-node3 26264 1727204268.56784: Calling groups_inventory to load vars for managed-node3 26264 1727204268.56788: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204268.56800: Calling all_plugins_play to load vars for managed-node3 26264 1727204268.56803: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204268.56806: Calling groups_plugins_play to load vars for managed-node3 26264 1727204268.58087: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000003f8 26264 1727204268.58090: WORKER PROCESS EXITING 26264 1727204268.58499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204268.60160: done with get_vars() 26264 1727204268.60190: done getting variables 26264 1727204268.60267: in VariableManager get_vars() 26264 1727204268.60278: Calling all_inventory to load vars for managed-node3 26264 1727204268.60281: Calling groups_inventory to load vars for managed-node3 26264 1727204268.60284: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204268.60289: Calling 
all_plugins_play to load vars for managed-node3 26264 1727204268.60291: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204268.60300: Calling groups_plugins_play to load vars for managed-node3 26264 1727204268.61568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204268.63203: done with get_vars() 26264 1727204268.63239: done queuing things up, now waiting for results queue to drain 26264 1727204268.63242: results queue empty 26264 1727204268.63243: checking for any_errors_fatal 26264 1727204268.63248: done checking for any_errors_fatal 26264 1727204268.63249: checking for max_fail_percentage 26264 1727204268.63250: done checking for max_fail_percentage 26264 1727204268.63251: checking to see if all hosts have failed and the running result is not ok 26264 1727204268.63251: done checking to see if all hosts have failed 26264 1727204268.63252: getting the remaining hosts for this loop 26264 1727204268.63253: done getting the remaining hosts for this loop 26264 1727204268.63256: getting the next task for host managed-node3 26264 1727204268.63261: done getting next task for host managed-node3 26264 1727204268.63265: ^ task is: TASK: Include the task 'delete_interface.yml' 26264 1727204268.63266: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204268.63269: getting variables 26264 1727204268.63270: in VariableManager get_vars() 26264 1727204268.63281: Calling all_inventory to load vars for managed-node3 26264 1727204268.63283: Calling groups_inventory to load vars for managed-node3 26264 1727204268.63285: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204268.63291: Calling all_plugins_play to load vars for managed-node3 26264 1727204268.63294: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204268.63297: Calling groups_plugins_play to load vars for managed-node3 26264 1727204268.64519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204268.66159: done with get_vars() 26264 1727204268.66189: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.882) 0:00:32.511 ***** 26264 1727204268.66275: entering _queue_task() for managed-node3/include_tasks 26264 1727204268.66628: worker is 1 (out of 1 available) 26264 1727204268.66640: exiting _queue_task() for managed-node3/include_tasks 26264 1727204268.66654: done queuing things up, now waiting for results queue to drain 26264 1727204268.66656: waiting for pending results... 
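The task banner above carries two timer fields, `(0:00:00.882)` for the previous task and `0:00:32.511` for total elapsed time, in `H:MM:SS.mmm` form. A small sketch of that formatting (our own helper, not Ansible's actual timer code):

```python
# Format elapsed seconds the way the timer fields in the task banners
# above appear, e.g. 0.882 -> "0:00:00.882", 32.511 -> "0:00:32.511".
# (Hypothetical helper; not Ansible's implementation.)

def format_elapsed(seconds: float) -> str:
    hours, rem = divmod(seconds, 3600)
    minutes, secs = divmod(rem, 60)
    return f"{int(hours)}:{int(minutes):02d}:{secs:06.3f}"

print(format_elapsed(0.882))
print(format_elapsed(32.511))
```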
26264 1727204268.66938: running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' 26264 1727204268.67061: in run() - task 0affcd87-79f5-5ff5-08b0-000000000054 26264 1727204268.67082: variable 'ansible_search_path' from source: unknown 26264 1727204268.67125: calling self._execute() 26264 1727204268.67220: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204268.67231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204268.67243: variable 'omit' from source: magic vars 26264 1727204268.67614: variable 'ansible_distribution_major_version' from source: facts 26264 1727204268.67634: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204268.67646: _execute() done 26264 1727204268.67654: dumping result to json 26264 1727204268.67661: done dumping result, returning 26264 1727204268.67672: done running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' [0affcd87-79f5-5ff5-08b0-000000000054] 26264 1727204268.67684: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000054 26264 1727204268.67809: no more pending results, returning what we have 26264 1727204268.67814: in VariableManager get_vars() 26264 1727204268.67849: Calling all_inventory to load vars for managed-node3 26264 1727204268.67852: Calling groups_inventory to load vars for managed-node3 26264 1727204268.67856: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204268.67873: Calling all_plugins_play to load vars for managed-node3 26264 1727204268.67876: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204268.67879: Calling groups_plugins_play to load vars for managed-node3 26264 1727204268.69035: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000054 26264 1727204268.69039: WORKER PROCESS EXITING 26264 1727204268.74048: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204268.75719: done with get_vars() 26264 1727204268.75749: variable 'ansible_search_path' from source: unknown 26264 1727204268.75768: we have included files to process 26264 1727204268.75770: generating all_blocks data 26264 1727204268.75771: done generating all_blocks data 26264 1727204268.75772: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 26264 1727204268.75773: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 26264 1727204268.75776: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 26264 1727204268.75983: done processing included file 26264 1727204268.75986: iterating over new_blocks loaded from include file 26264 1727204268.75987: in VariableManager get_vars() 26264 1727204268.76002: done with get_vars() 26264 1727204268.76004: filtering new block on tags 26264 1727204268.76019: done filtering new block on tags 26264 1727204268.76021: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node3 26264 1727204268.76027: extending task lists for all hosts with included blocks 26264 1727204268.76058: done extending task lists 26264 1727204268.76059: done processing included files 26264 1727204268.76060: results queue empty 26264 1727204268.76061: checking for any_errors_fatal 26264 1727204268.76063: done checking for any_errors_fatal 26264 1727204268.76065: checking for max_fail_percentage 26264 1727204268.76067: done checking for max_fail_percentage 26264 1727204268.76067: checking to see if all hosts have failed and the running result 
is not ok 26264 1727204268.76068: done checking to see if all hosts have failed 26264 1727204268.76069: getting the remaining hosts for this loop 26264 1727204268.76070: done getting the remaining hosts for this loop 26264 1727204268.76073: getting the next task for host managed-node3 26264 1727204268.76076: done getting next task for host managed-node3 26264 1727204268.76078: ^ task is: TASK: Remove test interface if necessary 26264 1727204268.76080: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204268.76083: getting variables 26264 1727204268.76084: in VariableManager get_vars() 26264 1727204268.76093: Calling all_inventory to load vars for managed-node3 26264 1727204268.76095: Calling groups_inventory to load vars for managed-node3 26264 1727204268.76097: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204268.76103: Calling all_plugins_play to load vars for managed-node3 26264 1727204268.76105: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204268.76108: Calling groups_plugins_play to load vars for managed-node3 26264 1727204268.77638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204268.79657: done with get_vars() 26264 1727204268.79683: done getting variables 26264 1727204268.79736: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.134) 0:00:32.646 ***** 26264 1727204268.79767: entering _queue_task() for managed-node3/command 26264 1727204268.80108: worker is 1 (out of 1 available) 26264 1727204268.80120: exiting _queue_task() for managed-node3/command 26264 1727204268.80132: done queuing things up, now waiting for results queue to drain 26264 1727204268.80134: waiting for pending results... 
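Before running the command module, the executor below creates a per-task remote tmp directory with a `/bin/sh -c '( umask 77 && mkdir -p ... && mkdir ... )'` one-liner. A local stand-in for that pattern (paths here are demo values, not the `/root/.ansible/tmp` paths in the log):

```shell
#!/bin/sh
# Reproduce the remote tmp-dir creation pattern visible in the log:
# restrict permissions via umask 77 (dir mode 700), ensure the parent
# exists, then create a uniquely named ansible-tmp-<epoch>-<pid> dir.
# Local stand-in paths; the real run uses ~/.ansible/tmp on the target.

base="${TMPDIR:-/tmp}/demo-ansible-tmp"
stamp="$(date +%s)"
dir="$base/ansible-tmp-$stamp-$$"

( umask 77 && mkdir -p "$base" && mkdir "$dir" ) && echo "$dir"
```

The subshell scopes the umask change, and the plain (non `-p`) `mkdir` on the leaf directory fails if the name already exists, which is what makes the timestamp-plus-pid name an effective collision guard.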
26264 1727204268.80420: running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary 26264 1727204268.80662: in run() - task 0affcd87-79f5-5ff5-08b0-000000000409 26264 1727204268.80687: variable 'ansible_search_path' from source: unknown 26264 1727204268.80699: variable 'ansible_search_path' from source: unknown 26264 1727204268.80742: calling self._execute() 26264 1727204268.80845: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204268.80856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204268.80872: variable 'omit' from source: magic vars 26264 1727204268.81451: variable 'ansible_distribution_major_version' from source: facts 26264 1727204268.81476: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204268.81487: variable 'omit' from source: magic vars 26264 1727204268.81533: variable 'omit' from source: magic vars 26264 1727204268.81632: variable 'interface' from source: set_fact 26264 1727204268.81656: variable 'omit' from source: magic vars 26264 1727204268.81702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204268.81740: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204268.81772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204268.81799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204268.81816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204268.81850: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204268.81857: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204268.81865: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204268.81958: Set connection var ansible_pipelining to False 26264 1727204268.81969: Set connection var ansible_connection to ssh 26264 1727204268.81976: Set connection var ansible_shell_type to sh 26264 1727204268.81987: Set connection var ansible_shell_executable to /bin/sh 26264 1727204268.82006: Set connection var ansible_timeout to 10 26264 1727204268.82018: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204268.82049: variable 'ansible_shell_executable' from source: unknown 26264 1727204268.82058: variable 'ansible_connection' from source: unknown 26264 1727204268.82066: variable 'ansible_module_compression' from source: unknown 26264 1727204268.82073: variable 'ansible_shell_type' from source: unknown 26264 1727204268.82079: variable 'ansible_shell_executable' from source: unknown 26264 1727204268.82086: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204268.82093: variable 'ansible_pipelining' from source: unknown 26264 1727204268.82098: variable 'ansible_timeout' from source: unknown 26264 1727204268.82108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204268.82250: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204268.82270: variable 'omit' from source: magic vars 26264 1727204268.82281: starting attempt loop 26264 1727204268.82288: running the handler 26264 1727204268.82309: _low_level_execute_command(): starting 26264 1727204268.82326: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204268.83098: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 
1727204268.83115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204268.83130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204268.83150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204268.83198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204268.83217: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204268.83311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204268.83333: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204268.83346: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204268.83358: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204268.83375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204268.83390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204268.83407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204268.83422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204268.83435: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204268.83449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204268.83524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204268.83555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204268.83580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204268.83661: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204268.85259: stdout chunk (state=3): >>>/root <<<
26264 1727204268.85454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204268.85457: stdout chunk (state=3): >>><<<
26264 1727204268.85460: stderr chunk (state=3): >>><<<
26264 1727204268.85575: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
26264 1727204268.85578: _low_level_execute_command(): starting
26264 1727204268.85581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401 `" && echo ansible-tmp-1727204268.854842-29030-256382185507401="` echo /root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401 `" ) && sleep 0'
26264 1727204268.86180: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
26264 1727204268.86195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204268.86211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204268.86233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204268.86280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204268.86293: stderr chunk (state=3): >>>debug2: match not found <<<
26264 1727204268.86309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204268.86330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
26264 1727204268.86343: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
26264 1727204268.86355: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
26264 1727204268.86370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204268.86384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204268.86401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204268.86413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204268.86424: stderr chunk (state=3): >>>debug2: match found <<<
26264 1727204268.86442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204268.86518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204268.86536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204268.86556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204268.86637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204268.88451: stdout chunk (state=3): >>>ansible-tmp-1727204268.854842-29030-256382185507401=/root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401 <<<
26264 1727204268.88569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204268.88662: stderr chunk (state=3): >>><<<
26264 1727204268.88678: stdout chunk (state=3): >>><<<
26264 1727204268.88873: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204268.854842-29030-256382185507401=/root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
26264 1727204268.88877: variable 'ansible_module_compression' from source: unknown
26264 1727204268.88879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
26264 1727204268.88881: variable 'ansible_facts' from source: unknown
26264 1727204268.88929: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401/AnsiballZ_command.py
26264 1727204268.89104: Sending initial data
26264 1727204268.89107: Sent initial data (155 bytes)
26264 1727204268.90172: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
26264 1727204268.90190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204268.90208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204268.90224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204268.90269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204268.90279: stderr chunk (state=3): >>>debug2: match not found <<<
26264 1727204268.90290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204268.90313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
26264 1727204268.90324: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
26264 1727204268.90339: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
26264 1727204268.90355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204268.90372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204268.90390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204268.90405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204268.90423: stderr chunk (state=3): >>>debug2: match found <<<
26264 1727204268.90439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204268.90524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204268.90557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204268.90578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204268.90662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204268.92322: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
26264 1727204268.92407: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
26264 1727204268.92431: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpr5za6qtd /root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401/AnsiballZ_command.py <<<
26264 1727204268.92765: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
26264 1727204268.93651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204268.94270: stderr chunk (state=3): >>><<<
26264 1727204268.94274: stdout chunk (state=3): >>><<<
26264 1727204268.94396: done transferring module to remote
26264 1727204268.94403: _low_level_execute_command(): starting
26264 1727204268.94406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401/ /root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401/AnsiballZ_command.py && sleep 0'
26264 1727204268.95124: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204268.95128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204268.95169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<<
26264 1727204268.95173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204268.95185: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204268.95245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204268.95251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204268.96086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204268.96622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204268.98403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204268.98406: stdout chunk (state=3): >>><<<
26264 1727204268.98409: stderr chunk (state=3): >>><<<
26264 1727204268.98506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
26264 1727204268.98510: _low_level_execute_command(): starting
26264 1727204268.98513: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401/AnsiballZ_command.py && sleep 0'
26264 1727204268.99051: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
26264 1727204268.99072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204268.99090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204268.99110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204268.99154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204268.99172: stderr chunk (state=3): >>>debug2: match not found <<<
26264 1727204268.99188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204268.99207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
26264 1727204268.99219: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
26264 1727204268.99234: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
26264 1727204268.99246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204268.99260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204268.99284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204268.99296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204268.99308: stderr chunk (state=3): >>>debug2: match found <<<
26264 1727204268.99321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204268.99401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204268.99418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204268.99433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204268.99525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204269.13830: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-24 14:57:49.124535", "end": "2024-09-24 14:57:49.136832", "delta": "0:00:00.012297", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<<
26264 1727204269.13835: stdout chunk (state=3): >>> <<<
26264 1727204269.15247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<<
26264 1727204269.15329: stderr chunk (state=3): >>><<<
26264 1727204269.15333: stdout chunk (state=3): >>><<<
26264 1727204269.15374: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-24 14:57:49.124535", "end": "2024-09-24 14:57:49.136832", "delta": "0:00:00.012297", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed.
26264 1727204269.15497: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
26264 1727204269.15501: _low_level_execute_command(): starting
26264 1727204269.15503: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204268.854842-29030-256382185507401/ > /dev/null 2>&1 && sleep 0'
26264 1727204269.16385: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204269.16389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204269.16427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<<
26264 1727204269.16431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204269.16434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204269.16504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204269.16508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204269.16510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204269.16572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204269.18326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204269.18409: stderr chunk (state=3): >>><<<
26264 1727204269.18413: stdout chunk (state=3): >>><<<
26264 1727204269.18674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
26264 1727204269.18678: handler run complete
26264 1727204269.18681: Evaluated conditional (False): False
26264 1727204269.18683: attempt loop complete, returning result
26264 1727204269.18685: _execute() done
26264 1727204269.18687: dumping result to json
26264 1727204269.18689: done dumping result, returning
26264 1727204269.18691: done running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary [0affcd87-79f5-5ff5-08b0-000000000409]
26264 1727204269.18693: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000409
26264 1727204269.18776: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000409
26264 1727204269.18780: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "del",
        "lsr27"
    ],
    "delta": "0:00:00.012297",
    "end": "2024-09-24 14:57:49.136832",
    "rc": 0,
    "start": "2024-09-24 14:57:49.124535"
}
26264 1727204269.18852: no more pending results, returning what we have
26264 1727204269.18857: results queue empty
26264 1727204269.18858: checking for any_errors_fatal
26264 1727204269.18859: done checking for any_errors_fatal
26264 1727204269.18860: checking for max_fail_percentage
26264 1727204269.18862: done checking for max_fail_percentage
26264 1727204269.18863: checking to see if all hosts have failed and the running result is not ok
26264 1727204269.18866: done checking to see if all hosts have failed
26264 1727204269.18866: getting the remaining hosts for this loop
26264 1727204269.18869: done getting the remaining hosts for this loop
26264 1727204269.18873: getting the next task for host managed-node3
26264 1727204269.18881: done getting next task for host managed-node3
26264 1727204269.18884: ^ task is: TASK: meta (flush_handlers)
26264 1727204269.18886: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204269.18891: getting variables
26264 1727204269.18892: in VariableManager get_vars()
26264 1727204269.18922: Calling all_inventory to load vars for managed-node3
26264 1727204269.18925: Calling groups_inventory to load vars for managed-node3
26264 1727204269.18928: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204269.18940: Calling all_plugins_play to load vars for managed-node3
26264 1727204269.18942: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204269.18945: Calling groups_plugins_play to load vars for managed-node3
26264 1727204269.21484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204269.24013: done with get_vars()
26264 1727204269.24051: done getting variables
26264 1727204269.24128: in VariableManager get_vars()
26264 1727204269.24139: Calling all_inventory to load vars for managed-node3
26264 1727204269.24142: Calling groups_inventory to load vars for managed-node3
26264 1727204269.24144: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204269.24153: Calling all_plugins_play to load vars for managed-node3
26264 1727204269.24155: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204269.24158: Calling groups_plugins_play to load vars for managed-node3
26264 1727204269.26631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204269.29735: done with get_vars()
26264 1727204269.29780: done queuing things up, now waiting for results queue to drain
26264 1727204269.29783: results queue empty
26264 1727204269.29784: checking for any_errors_fatal
26264 1727204269.29788: done checking for any_errors_fatal
26264 1727204269.29789: checking for max_fail_percentage
26264 1727204269.29790: done checking for max_fail_percentage
26264 1727204269.29791: checking to see if all hosts have failed and the running result is not ok
26264 1727204269.29798: done checking to see if all hosts have failed
26264 1727204269.29799: getting the remaining hosts for this loop
26264 1727204269.29800: done getting the remaining hosts for this loop
26264 1727204269.29804: getting the next task for host managed-node3
26264 1727204269.29809: done getting next task for host managed-node3
26264 1727204269.29811: ^ task is: TASK: meta (flush_handlers)
26264 1727204269.29812: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204269.29815: getting variables
26264 1727204269.29816: in VariableManager get_vars()
26264 1727204269.29828: Calling all_inventory to load vars for managed-node3
26264 1727204269.29830: Calling groups_inventory to load vars for managed-node3
26264 1727204269.29832: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204269.29838: Calling all_plugins_play to load vars for managed-node3
26264 1727204269.29841: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204269.29844: Calling groups_plugins_play to load vars for managed-node3
26264 1727204269.32389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204269.34345: done with get_vars()
26264 1727204269.34377: done getting variables
26264 1727204269.34431: in VariableManager get_vars()
26264 1727204269.34441: Calling all_inventory to load vars for managed-node3
26264 1727204269.34444: Calling groups_inventory to load vars for managed-node3
26264 1727204269.34446: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204269.34453: Calling all_plugins_play to load vars for managed-node3
26264 1727204269.34456: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204269.34459: Calling groups_plugins_play to load vars for managed-node3
26264 1727204269.37539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204269.40767: done with get_vars()
26264 1727204269.40803: done queuing things up, now waiting for results queue to drain
26264 1727204269.40806: results queue empty
26264 1727204269.40807: checking for any_errors_fatal
26264 1727204269.40808: done checking for any_errors_fatal
26264 1727204269.40809: checking for max_fail_percentage
26264 1727204269.40810: done checking for max_fail_percentage
26264 1727204269.40811: checking to see if all hosts have failed and the running result is not ok
26264 1727204269.40812: done checking to see if all hosts have failed
26264 1727204269.40812: getting the remaining hosts for this loop
26264 1727204269.40813: done getting the remaining hosts for this loop
26264 1727204269.40816: getting the next task for host managed-node3
26264 1727204269.40820: done getting next task for host managed-node3
26264 1727204269.40821: ^ task is: None
26264 1727204269.40822: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204269.40823: done queuing things up, now waiting for results queue to drain
26264 1727204269.40824: results queue empty
26264 1727204269.40825: checking for any_errors_fatal
26264 1727204269.40825: done checking for any_errors_fatal
26264 1727204269.40827: checking for max_fail_percentage
26264 1727204269.40828: done checking for max_fail_percentage
26264 1727204269.40829: checking to see if all hosts have failed and the running result is not ok
26264 1727204269.40830: done checking to see if all hosts have failed
26264 1727204269.40831: getting the next task for host managed-node3
26264 1727204269.40833: done getting next task for host managed-node3
26264 1727204269.40834: ^ task is: None
26264 1727204269.40835: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204269.41273: in VariableManager get_vars()
26264 1727204269.41297: done with get_vars()
26264 1727204269.41303: in VariableManager get_vars()
26264 1727204269.41315: done with get_vars()
26264 1727204269.41320: variable 'omit' from source: magic vars
26264 1727204269.41442: variable 'profile' from source: play vars
26264 1727204269.41750: in VariableManager get_vars()
26264 1727204269.41769: done with get_vars()
26264 1727204269.41793: variable 'omit' from source: magic vars
26264 1727204269.41861: variable 'profile' from source: play vars

PLAY [Remove {{ profile }}] ****************************************************
26264 1727204269.43137: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
26264 1727204269.43302: getting the remaining hosts for this loop
26264 1727204269.43304: done getting the remaining hosts for this loop
26264 1727204269.43306: getting the next task for host managed-node3
26264 1727204269.43310: done getting next task for host managed-node3
26264 1727204269.43312: ^ task is: TASK: Gathering Facts
26264 1727204269.43314: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204269.43316: getting variables
26264 1727204269.43317: in VariableManager get_vars()
26264 1727204269.43330: Calling all_inventory to load vars for managed-node3
26264 1727204269.43332: Calling groups_inventory to load vars for managed-node3
26264 1727204269.43334: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204269.43340: Calling all_plugins_play to load vars for managed-node3
26264 1727204269.43342: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204269.43345: Calling groups_plugins_play to load vars for managed-node3
26264 1727204269.46025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204269.49475: done with get_vars()
26264 1727204269.49499: done getting variables
26264 1727204269.49553: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.698) 0:00:33.344 *****
26264 1727204269.49583: entering _queue_task() for managed-node3/gather_facts
26264 1727204269.49916: worker is 1 (out of 1 available)
26264 1727204269.49928: exiting _queue_task() for managed-node3/gather_facts
26264 1727204269.49941: done queuing things up, now waiting for results queue to drain
26264 1727204269.49943: waiting for pending results...
26264 1727204269.50944: running TaskExecutor() for managed-node3/TASK: Gathering Facts
26264 1727204269.51155: in run() - task 0affcd87-79f5-5ff5-08b0-000000000417
26264 1727204269.51243: variable 'ansible_search_path' from source: unknown
26264 1727204269.51284: calling self._execute()
26264 1727204269.51502: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204269.51514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204269.51569: variable 'omit' from source: magic vars
26264 1727204269.52757: variable 'ansible_distribution_major_version' from source: facts
26264 1727204269.52780: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204269.52791: variable 'omit' from source: magic vars
26264 1727204269.52823: variable 'omit' from source: magic vars
26264 1727204269.52867: variable 'omit' from source: magic vars
26264 1727204269.52920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
26264 1727204269.52967: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
26264 1727204269.53000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
26264 1727204269.53030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204269.53052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204269.53089: variable 'inventory_hostname' from source: host vars for 'managed-node3'
26264 1727204269.53101: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204269.53108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204269.53213: Set connection var ansible_pipelining to False
26264 1727204269.53220: Set connection var ansible_connection to ssh
26264 1727204269.53226: Set connection var ansible_shell_type to sh
26264 1727204269.53238: Set connection var ansible_shell_executable to /bin/sh
26264 1727204269.53252: Set connection var ansible_timeout to 10
26264 1727204269.53262: Set connection var ansible_module_compression to ZIP_DEFLATED
26264 1727204269.53290: variable 'ansible_shell_executable' from source: unknown
26264 1727204269.53296: variable 'ansible_connection' from source: unknown
26264 1727204269.53301: variable 'ansible_module_compression' from source: unknown
26264 1727204269.53306: variable 'ansible_shell_type' from source: unknown
26264 1727204269.53316: variable 'ansible_shell_executable' from source: unknown
26264 1727204269.53321: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204269.53327: variable 'ansible_pipelining' from source: unknown
26264 1727204269.53332: variable 'ansible_timeout' from source: unknown
26264 1727204269.53341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204269.53521: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
26264 1727204269.53542: variable 'omit' from source: magic vars
26264 1727204269.53557: starting attempt loop
26264 1727204269.53563: running the handler
26264 1727204269.53583: variable 'ansible_facts' from source: unknown
26264 1727204269.53602: _low_level_execute_command(): starting
26264 1727204269.53613: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
26264 1727204269.54423: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
26264 1727204269.54441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204269.54459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204269.54482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204269.54532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204269.54547: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204269.54566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204269.54585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204269.54596: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204269.54606: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204269.54621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204269.54637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204269.54660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204269.54674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204269.54685: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204269.54697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204269.54782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204269.54803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204269.54816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204269.54902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 
1727204269.56504: stdout chunk (state=3): >>>/root <<< 26264 1727204269.56698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204269.56701: stdout chunk (state=3): >>><<< 26264 1727204269.56704: stderr chunk (state=3): >>><<< 26264 1727204269.56770: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204269.56774: _low_level_execute_command(): starting 26264 1727204269.56778: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503 `" && echo ansible-tmp-1727204269.5672371-29058-86490545366503="` echo /root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503 `" ) && sleep 0' 26264 1727204269.57809: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 26264 1727204269.57825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204269.57841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204269.57860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204269.57931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204269.57945: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204269.57960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204269.57982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204269.57994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204269.58005: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204269.58017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204269.58032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204269.58056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204269.58084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204269.58098: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204269.58119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204269.58209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204269.58226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204269.58248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 26264 1727204269.58437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204269.60257: stdout chunk (state=3): >>>ansible-tmp-1727204269.5672371-29058-86490545366503=/root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503 <<< 26264 1727204269.60380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204269.60473: stderr chunk (state=3): >>><<< 26264 1727204269.60476: stdout chunk (state=3): >>><<< 26264 1727204269.60570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204269.5672371-29058-86490545366503=/root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204269.60574: variable 'ansible_module_compression' from source: unknown 26264 1727204269.60769: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204269.60772: variable 'ansible_facts' from source: unknown 26264 1727204269.60835: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503/AnsiballZ_setup.py 26264 1727204269.61505: Sending initial data 26264 1727204269.61508: Sent initial data (153 bytes) 26264 1727204269.63813: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204269.63892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204269.63910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204269.63929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204269.63976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204269.64481: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204269.64499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204269.64518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204269.64531: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204269.64544: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204269.64559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204269.64576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204269.64592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204269.64605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 
1727204269.64616: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204269.64631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204269.64711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204269.64729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204269.64744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204269.64820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204269.66500: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204269.66541: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204269.66586: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp3zqmim3w /root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503/AnsiballZ_setup.py <<< 26264 1727204269.66625: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204269.69411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204269.69677: stderr chunk (state=3): >>><<< 26264 1727204269.69680: stdout chunk (state=3): >>><<< 26264 1727204269.69683: done transferring module 
to remote 26264 1727204269.69685: _low_level_execute_command(): starting 26264 1727204269.69692: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503/ /root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503/AnsiballZ_setup.py && sleep 0' 26264 1727204269.70674: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204269.70689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204269.70703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204269.70720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204269.70767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204269.70782: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204269.70796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204269.70812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204269.70822: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204269.70833: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204269.70844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204269.70859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204269.70876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204269.70890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204269.70900: stderr chunk (state=3): >>>debug2: match found <<< 26264 
1727204269.70914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204269.70997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204269.71020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204269.71034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204269.71110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204269.72893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204269.72896: stdout chunk (state=3): >>><<< 26264 1727204269.72898: stderr chunk (state=3): >>><<< 26264 1727204269.72995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204269.72998: 
_low_level_execute_command(): starting 26264 1727204269.73001: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503/AnsiballZ_setup.py && sleep 0' 26264 1727204269.73824: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204269.73828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204269.73871: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204269.73875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204269.73878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204269.73930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204269.74266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204269.74277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204269.74353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204270.24076: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", 
"ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "49", "epoch": "1727204269", "epoch_int": "1727204269", "date": "2024-09-24", "time": "14:57:49", "iso8601_micro": "2024-09-24T18:57:49.966851Z", "iso8601": "2024-09-24T18:57:49Z", "iso8601_basic": "20240924T145749966851", "iso8601_basic_short": "20240924T145749", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": 
[], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.47, "5m": 0.38, "15m": 0.2}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", 
"which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", 
"tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", 
"alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2817, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 715, "free": 2817}, "nocache": {"free": 3277, "used": 255}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 616, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264280018944, "block_size": 4096, "block_total": 65519355, "block_available": 64521489, "block_used": 997866, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204270.25814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204270.25818: stdout chunk (state=3): >>><<< 26264 1727204270.25825: stderr chunk (state=3): >>><<< 26264 1727204270.25867: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "49", "epoch": "1727204269", "epoch_int": "1727204269", "date": "2024-09-24", "time": "14:57:49", "iso8601_micro": "2024-09-24T18:57:49.966851Z", "iso8601": "2024-09-24T18:57:49Z", "iso8601_basic": "20240924T145749966851", "iso8601_basic_short": "20240924T145749", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, 
"ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.47, "5m": 0.38, "15m": 0.2}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", 
"which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", 
"tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", 
"alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2817, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 715, "free": 2817}, "nocache": {"free": 3277, "used": 255}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 616, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264280018944, "block_size": 4096, "block_total": 65519355, "block_available": 64521489, "block_used": 997866, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204270.26229: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204270.26252: _low_level_execute_command(): starting 26264 1727204270.26255: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204269.5672371-29058-86490545366503/ > /dev/null 2>&1 && sleep 0' 26264 1727204270.28032: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204270.28036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204270.28124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204270.28129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204270.28132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204270.28292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204270.28296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204270.28317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204270.28383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204270.30166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204270.30243: stderr chunk (state=3): >>><<< 26264 1727204270.30246: stdout chunk (state=3): >>><<< 26264 1727204270.30572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204270.30575: handler run complete 26264 1727204270.30579: variable 'ansible_facts' from source: unknown 26264 1727204270.30581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204270.30805: variable 'ansible_facts' from source: unknown 26264 1727204270.31009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204270.31256: attempt loop complete, returning result 26264 1727204270.31268: _execute() done 26264 1727204270.31275: dumping result to json 26264 1727204270.31309: done dumping result, returning 26264 1727204270.31322: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-5ff5-08b0-000000000417] 26264 1727204270.31409: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000417 ok: [managed-node3] 26264 1727204270.32982: no more pending results, returning what we have 26264 1727204270.32985: results queue empty 26264 1727204270.32986: checking for any_errors_fatal 26264 1727204270.32987: done checking for any_errors_fatal 26264 1727204270.32988: checking for max_fail_percentage 26264 1727204270.32989: done checking for max_fail_percentage 26264 1727204270.32990: checking to see if all hosts have failed and the running result is not ok 26264 1727204270.32991: done checking to see if all hosts have failed 26264 1727204270.32992: getting the remaining hosts for this loop 26264 1727204270.32994: done getting the remaining hosts for this loop 26264 1727204270.32998: getting the next task for host managed-node3 26264 1727204270.33004: done getting next task for host managed-node3 26264 1727204270.33005: ^ task is: TASK: 
meta (flush_handlers) 26264 1727204270.33008: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204270.33011: getting variables 26264 1727204270.33013: in VariableManager get_vars() 26264 1727204270.33045: Calling all_inventory to load vars for managed-node3 26264 1727204270.33050: Calling groups_inventory to load vars for managed-node3 26264 1727204270.33052: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204270.33070: Calling all_plugins_play to load vars for managed-node3 26264 1727204270.33073: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204270.33081: Calling groups_plugins_play to load vars for managed-node3 26264 1727204270.33698: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000417 26264 1727204270.33701: WORKER PROCESS EXITING 26264 1727204270.35636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204270.40566: done with get_vars() 26264 1727204270.40598: done getting variables 26264 1727204270.40672: in VariableManager get_vars() 26264 1727204270.40685: Calling all_inventory to load vars for managed-node3 26264 1727204270.40688: Calling groups_inventory to load vars for managed-node3 26264 1727204270.40690: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204270.40694: Calling all_plugins_play to load vars for managed-node3 26264 1727204270.40696: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204270.40699: Calling groups_plugins_play to load vars for managed-node3 26264 1727204270.42190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 
1727204270.44586: done with get_vars() 26264 1727204270.44621: done queuing things up, now waiting for results queue to drain 26264 1727204270.44624: results queue empty 26264 1727204270.44625: checking for any_errors_fatal 26264 1727204270.44629: done checking for any_errors_fatal 26264 1727204270.44630: checking for max_fail_percentage 26264 1727204270.44631: done checking for max_fail_percentage 26264 1727204270.44636: checking to see if all hosts have failed and the running result is not ok 26264 1727204270.44637: done checking to see if all hosts have failed 26264 1727204270.44637: getting the remaining hosts for this loop 26264 1727204270.44638: done getting the remaining hosts for this loop 26264 1727204270.44642: getting the next task for host managed-node3 26264 1727204270.44646: done getting next task for host managed-node3 26264 1727204270.44651: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 26264 1727204270.44653: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204270.44666: getting variables 26264 1727204270.44667: in VariableManager get_vars() 26264 1727204270.44797: Calling all_inventory to load vars for managed-node3 26264 1727204270.44800: Calling groups_inventory to load vars for managed-node3 26264 1727204270.44802: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204270.44808: Calling all_plugins_play to load vars for managed-node3 26264 1727204270.44810: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204270.44813: Calling groups_plugins_play to load vars for managed-node3 26264 1727204270.47491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204270.51840: done with get_vars() 26264 1727204270.51868: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:50 -0400 (0:00:01.023) 0:00:34.368 ***** 26264 1727204270.51958: entering _queue_task() for managed-node3/include_tasks 26264 1727204270.52376: worker is 1 (out of 1 available) 26264 1727204270.52390: exiting _queue_task() for managed-node3/include_tasks 26264 1727204270.52402: done queuing things up, now waiting for results queue to drain 26264 1727204270.52404: waiting for pending results... 
26264 1727204270.52706: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 26264 1727204270.52882: in run() - task 0affcd87-79f5-5ff5-08b0-00000000005c 26264 1727204270.52902: variable 'ansible_search_path' from source: unknown 26264 1727204270.52909: variable 'ansible_search_path' from source: unknown 26264 1727204270.52946: calling self._execute() 26264 1727204270.53044: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204270.53060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204270.53082: variable 'omit' from source: magic vars 26264 1727204270.53490: variable 'ansible_distribution_major_version' from source: facts 26264 1727204270.53513: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204270.53524: _execute() done 26264 1727204270.53532: dumping result to json 26264 1727204270.53539: done dumping result, returning 26264 1727204270.53554: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-5ff5-08b0-00000000005c] 26264 1727204270.53625: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000005c 26264 1727204270.53778: no more pending results, returning what we have 26264 1727204270.53784: in VariableManager get_vars() 26264 1727204270.53829: Calling all_inventory to load vars for managed-node3 26264 1727204270.53833: Calling groups_inventory to load vars for managed-node3 26264 1727204270.53836: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204270.53852: Calling all_plugins_play to load vars for managed-node3 26264 1727204270.53855: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204270.53859: Calling groups_plugins_play to load vars for managed-node3 26264 1727204270.54885: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000005c 26264 
1727204270.54889: WORKER PROCESS EXITING 26264 1727204270.56295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204270.57953: done with get_vars() 26264 1727204270.57983: variable 'ansible_search_path' from source: unknown 26264 1727204270.57984: variable 'ansible_search_path' from source: unknown 26264 1727204270.58140: we have included files to process 26264 1727204270.58142: generating all_blocks data 26264 1727204270.58143: done generating all_blocks data 26264 1727204270.58144: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26264 1727204270.58145: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26264 1727204270.58148: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26264 1727204270.58748: done processing included file 26264 1727204270.58750: iterating over new_blocks loaded from include file 26264 1727204270.58752: in VariableManager get_vars() 26264 1727204270.58776: done with get_vars() 26264 1727204270.58778: filtering new block on tags 26264 1727204270.58795: done filtering new block on tags 26264 1727204270.58797: in VariableManager get_vars() 26264 1727204270.58815: done with get_vars() 26264 1727204270.58817: filtering new block on tags 26264 1727204270.58835: done filtering new block on tags 26264 1727204270.58838: in VariableManager get_vars() 26264 1727204270.58857: done with get_vars() 26264 1727204270.58859: filtering new block on tags 26264 1727204270.58876: done filtering new block on tags 26264 1727204270.58879: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 26264 1727204270.58885: extending task lists for all hosts 
with included blocks 26264 1727204270.59967: done extending task lists 26264 1727204270.59969: done processing included files 26264 1727204270.59970: results queue empty 26264 1727204270.59970: checking for any_errors_fatal 26264 1727204270.59972: done checking for any_errors_fatal 26264 1727204270.59973: checking for max_fail_percentage 26264 1727204270.59974: done checking for max_fail_percentage 26264 1727204270.59975: checking to see if all hosts have failed and the running result is not ok 26264 1727204270.59976: done checking to see if all hosts have failed 26264 1727204270.59977: getting the remaining hosts for this loop 26264 1727204270.59978: done getting the remaining hosts for this loop 26264 1727204270.59980: getting the next task for host managed-node3 26264 1727204270.59985: done getting next task for host managed-node3 26264 1727204270.59987: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 26264 1727204270.59990: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204270.59999: getting variables 26264 1727204270.60000: in VariableManager get_vars() 26264 1727204270.60016: Calling all_inventory to load vars for managed-node3 26264 1727204270.60019: Calling groups_inventory to load vars for managed-node3 26264 1727204270.60020: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204270.60026: Calling all_plugins_play to load vars for managed-node3 26264 1727204270.60030: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204270.60033: Calling groups_plugins_play to load vars for managed-node3 26264 1727204270.62790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204270.65819: done with get_vars() 26264 1727204270.65852: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.139) 0:00:34.508 ***** 26264 1727204270.65939: entering _queue_task() for managed-node3/setup 26264 1727204270.66987: worker is 1 (out of 1 available) 26264 1727204270.67000: exiting _queue_task() for managed-node3/setup 26264 1727204270.67012: done queuing things up, now waiting for results queue to drain 26264 1727204270.67013: waiting for pending results... 
26264 1727204270.68006: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 26264 1727204270.68273: in run() - task 0affcd87-79f5-5ff5-08b0-000000000458 26264 1727204270.68296: variable 'ansible_search_path' from source: unknown 26264 1727204270.68374: variable 'ansible_search_path' from source: unknown 26264 1727204270.68418: calling self._execute() 26264 1727204270.68626: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204270.68638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204270.68657: variable 'omit' from source: magic vars 26264 1727204270.69528: variable 'ansible_distribution_major_version' from source: facts 26264 1727204270.69550: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204270.70116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204270.75281: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204270.75555: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204270.75602: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204270.75758: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204270.75794: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204270.75995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204270.76032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204270.76074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204270.76122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204270.76301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204270.76360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204270.76392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204270.76428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204270.76626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204270.76646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204270.76929: variable '__network_required_facts' from source: role 
'' defaults 26264 1727204270.76959: variable 'ansible_facts' from source: unknown 26264 1727204270.78624: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 26264 1727204270.78677: when evaluation is False, skipping this task 26264 1727204270.78689: _execute() done 26264 1727204270.78697: dumping result to json 26264 1727204270.78706: done dumping result, returning 26264 1727204270.78719: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-5ff5-08b0-000000000458] 26264 1727204270.78779: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000458 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204270.78941: no more pending results, returning what we have 26264 1727204270.78946: results queue empty 26264 1727204270.78947: checking for any_errors_fatal 26264 1727204270.78948: done checking for any_errors_fatal 26264 1727204270.78949: checking for max_fail_percentage 26264 1727204270.78950: done checking for max_fail_percentage 26264 1727204270.78951: checking to see if all hosts have failed and the running result is not ok 26264 1727204270.78952: done checking to see if all hosts have failed 26264 1727204270.78953: getting the remaining hosts for this loop 26264 1727204270.78954: done getting the remaining hosts for this loop 26264 1727204270.78959: getting the next task for host managed-node3 26264 1727204270.78969: done getting next task for host managed-node3 26264 1727204270.78973: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 26264 1727204270.78976: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204270.78990: getting variables 26264 1727204270.78992: in VariableManager get_vars() 26264 1727204270.79031: Calling all_inventory to load vars for managed-node3 26264 1727204270.79034: Calling groups_inventory to load vars for managed-node3 26264 1727204270.79036: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204270.79048: Calling all_plugins_play to load vars for managed-node3 26264 1727204270.79051: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204270.79054: Calling groups_plugins_play to load vars for managed-node3 26264 1727204270.79818: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000458 26264 1727204270.79822: WORKER PROCESS EXITING 26264 1727204270.80974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204270.83653: done with get_vars() 26264 1727204270.83687: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.178) 0:00:34.687 ***** 26264 1727204270.83818: entering _queue_task() for managed-node3/stat 26264 1727204270.84201: worker is 1 (out of 1 available) 26264 1727204270.84215: exiting _queue_task() for managed-node3/stat 26264 1727204270.84228: done queuing things up, now waiting for results queue to drain 26264 1727204270.84230: waiting for pending results... 
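The skip logged above for "Ensure ansible_facts used by role are present" comes from the guard condition the log shows being evaluated: `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`. The setup task only re-runs when some fact the role needs is missing. A minimal sketch of that test (the fact names below are illustrative, not the role's actual list):

```python
# Sketch of the guard behind "Ensure ansible_facts used by role are present".
# required_facts stands in for __network_required_facts (illustrative values);
# the setup task runs only if any required fact is absent from ansible_facts.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

# Jinja equivalent:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
missing = [f for f in required_facts if f not in ansible_facts]
should_run_setup = len(missing) > 0
print(should_run_setup)  # False -> "when evaluation is False, skipping this task"
```

Since all facts were already gathered earlier in the run, the difference is empty and the task is skipped, which is why the result above reports `"changed": false`.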
26264 1727204270.84539: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 26264 1727204270.84684: in run() - task 0affcd87-79f5-5ff5-08b0-00000000045a 26264 1727204270.84698: variable 'ansible_search_path' from source: unknown 26264 1727204270.84701: variable 'ansible_search_path' from source: unknown 26264 1727204270.84743: calling self._execute() 26264 1727204270.84852: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204270.84860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204270.84872: variable 'omit' from source: magic vars 26264 1727204270.85294: variable 'ansible_distribution_major_version' from source: facts 26264 1727204270.85307: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204270.85501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204270.85809: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204270.85855: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204270.85894: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204270.85936: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204270.86030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204270.86057: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204270.86090: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204270.86121: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204270.86216: variable '__network_is_ostree' from source: set_fact 26264 1727204270.86227: Evaluated conditional (not __network_is_ostree is defined): False 26264 1727204270.86230: when evaluation is False, skipping this task 26264 1727204270.86237: _execute() done 26264 1727204270.86240: dumping result to json 26264 1727204270.86243: done dumping result, returning 26264 1727204270.86256: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-5ff5-08b0-00000000045a] 26264 1727204270.86261: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000045a skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 26264 1727204270.86402: no more pending results, returning what we have 26264 1727204270.86407: results queue empty 26264 1727204270.86408: checking for any_errors_fatal 26264 1727204270.86413: done checking for any_errors_fatal 26264 1727204270.86414: checking for max_fail_percentage 26264 1727204270.86415: done checking for max_fail_percentage 26264 1727204270.86416: checking to see if all hosts have failed and the running result is not ok 26264 1727204270.86417: done checking to see if all hosts have failed 26264 1727204270.86418: getting the remaining hosts for this loop 26264 1727204270.86420: done getting the remaining hosts for this loop 26264 1727204270.86424: getting the next task for host managed-node3 26264 1727204270.86433: done getting next task for host managed-node3 26264 
1727204270.86438: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 26264 1727204270.86441: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204270.86463: getting variables 26264 1727204270.86472: in VariableManager get_vars() 26264 1727204270.86514: Calling all_inventory to load vars for managed-node3 26264 1727204270.86517: Calling groups_inventory to load vars for managed-node3 26264 1727204270.86520: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204270.86533: Calling all_plugins_play to load vars for managed-node3 26264 1727204270.86537: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204270.86541: Calling groups_plugins_play to load vars for managed-node3 26264 1727204270.87075: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000045a 26264 1727204270.87079: WORKER PROCESS EXITING 26264 1727204270.88500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204270.90422: done with get_vars() 26264 1727204270.91187: done getting variables 26264 1727204270.91369: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
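Both ostree-related tasks ("Check if system is ostree" and "Set flag to indicate system is ostree") are skipped for the same reason: the log shows `__network_is_ostree` coming `from source: set_fact`, i.e. it was already populated earlier in the run, so the shared `not __network_is_ostree is defined` condition is False. A small sketch of that short-circuit (the stored value is illustrative):

```python
# Sketch of the skip logic: once __network_is_ostree exists as a fact,
# "not __network_is_ostree is defined" evaluates False and both tasks skip.
host_facts = {"__network_is_ostree": False}  # set by an earlier set_fact (illustrative value)

# Jinja equivalent: not __network_is_ostree is defined
condition = "__network_is_ostree" not in host_facts
print(condition)  # False -> skip_reason: "Conditional result was False"
```

This is a common caching idiom in roles: an expensive probe (here, a `stat` of the ostree marker) runs once, and later invocations reuse the stored fact.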
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.075) 0:00:34.763 ***** 26264 1727204270.91403: entering _queue_task() for managed-node3/set_fact 26264 1727204270.92089: worker is 1 (out of 1 available) 26264 1727204270.92101: exiting _queue_task() for managed-node3/set_fact 26264 1727204270.92114: done queuing things up, now waiting for results queue to drain 26264 1727204270.92116: waiting for pending results... 26264 1727204270.92619: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 26264 1727204270.92784: in run() - task 0affcd87-79f5-5ff5-08b0-00000000045b 26264 1727204270.92799: variable 'ansible_search_path' from source: unknown 26264 1727204270.92803: variable 'ansible_search_path' from source: unknown 26264 1727204270.92850: calling self._execute() 26264 1727204270.92962: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204270.92970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204270.92982: variable 'omit' from source: magic vars 26264 1727204270.93408: variable 'ansible_distribution_major_version' from source: facts 26264 1727204270.93422: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204270.93625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204270.93930: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204270.93990: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204270.94027: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 
1727204270.94083: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204270.94184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204270.94210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204270.94236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204270.94282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204270.94458: variable '__network_is_ostree' from source: set_fact 26264 1727204270.94462: Evaluated conditional (not __network_is_ostree is defined): False 26264 1727204270.94467: when evaluation is False, skipping this task 26264 1727204270.94470: _execute() done 26264 1727204270.94472: dumping result to json 26264 1727204270.94475: done dumping result, returning 26264 1727204270.94477: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-5ff5-08b0-00000000045b] 26264 1727204270.94479: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000045b 26264 1727204270.94539: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000045b 26264 1727204270.94543: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 26264 1727204270.94596: no more pending results, returning what we 
have 26264 1727204270.94601: results queue empty 26264 1727204270.94602: checking for any_errors_fatal 26264 1727204270.94610: done checking for any_errors_fatal 26264 1727204270.94611: checking for max_fail_percentage 26264 1727204270.94613: done checking for max_fail_percentage 26264 1727204270.94614: checking to see if all hosts have failed and the running result is not ok 26264 1727204270.94615: done checking to see if all hosts have failed 26264 1727204270.94616: getting the remaining hosts for this loop 26264 1727204270.94618: done getting the remaining hosts for this loop 26264 1727204270.94622: getting the next task for host managed-node3 26264 1727204270.94633: done getting next task for host managed-node3 26264 1727204270.94637: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 26264 1727204270.94641: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204270.94660: getting variables 26264 1727204270.94662: in VariableManager get_vars() 26264 1727204270.94707: Calling all_inventory to load vars for managed-node3 26264 1727204270.94711: Calling groups_inventory to load vars for managed-node3 26264 1727204270.94714: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204270.94727: Calling all_plugins_play to load vars for managed-node3 26264 1727204270.94730: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204270.94734: Calling groups_plugins_play to load vars for managed-node3 26264 1727204270.97403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204271.00554: done with get_vars() 26264 1727204271.00584: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.092) 0:00:34.856 ***** 26264 1727204271.00704: entering _queue_task() for managed-node3/service_facts 26264 1727204271.01073: worker is 1 (out of 1 available) 26264 1727204271.01093: exiting _queue_task() for managed-node3/service_facts 26264 1727204271.01104: done queuing things up, now waiting for results queue to drain 26264 1727204271.01106: waiting for pending results... 
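The repeated "Calling all_inventory / groups_inventory / ... / groups_plugins_play to load vars" sequence above is the VariableManager merging variable sources in ascending precedence order, so sources called later override earlier ones. A minimal sketch of that layered merge (source names taken from the log; the variables and values are illustrative):

```python
# Layered variable merge, lowest precedence first; later layers win on conflict.
layers = [
    ("all_inventory",        {"ansible_timeout": 10, "role_var": "inventory"}),
    ("groups_inventory",     {"role_var": "group"}),
    ("groups_plugins_play",  {"role_var": "play"}),  # called last -> highest here
]

merged = {}
for _source, variables in layers:
    merged.update(variables)  # later source overrides earlier one
print(merged["role_var"])  # "play"
```

Real Ansible precedence has many more levels (role defaults, host vars, play vars, extra vars, ...), but the mechanism is the same flattening shown here.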
26264 1727204271.01408: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 26264 1727204271.01561: in run() - task 0affcd87-79f5-5ff5-08b0-00000000045d 26264 1727204271.01577: variable 'ansible_search_path' from source: unknown 26264 1727204271.01581: variable 'ansible_search_path' from source: unknown 26264 1727204271.01616: calling self._execute() 26264 1727204271.01720: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204271.01724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204271.01744: variable 'omit' from source: magic vars 26264 1727204271.02902: variable 'ansible_distribution_major_version' from source: facts 26264 1727204271.02915: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204271.02928: variable 'omit' from source: magic vars 26264 1727204271.02993: variable 'omit' from source: magic vars 26264 1727204271.03024: variable 'omit' from source: magic vars 26264 1727204271.03073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204271.03107: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204271.03127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204271.03157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204271.03173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204271.03205: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204271.03209: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204271.03211: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 26264 1727204271.03320: Set connection var ansible_pipelining to False 26264 1727204271.03324: Set connection var ansible_connection to ssh 26264 1727204271.03326: Set connection var ansible_shell_type to sh 26264 1727204271.03331: Set connection var ansible_shell_executable to /bin/sh 26264 1727204271.03339: Set connection var ansible_timeout to 10 26264 1727204271.03346: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204271.03381: variable 'ansible_shell_executable' from source: unknown 26264 1727204271.03384: variable 'ansible_connection' from source: unknown 26264 1727204271.03387: variable 'ansible_module_compression' from source: unknown 26264 1727204271.03390: variable 'ansible_shell_type' from source: unknown 26264 1727204271.03392: variable 'ansible_shell_executable' from source: unknown 26264 1727204271.03394: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204271.03396: variable 'ansible_pipelining' from source: unknown 26264 1727204271.03398: variable 'ansible_timeout' from source: unknown 26264 1727204271.03403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204271.03728: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204271.03738: variable 'omit' from source: magic vars 26264 1727204271.03744: starting attempt loop 26264 1727204271.03747: running the handler 26264 1727204271.03767: _low_level_execute_command(): starting 26264 1727204271.03777: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204271.05811: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 26264 1727204271.05820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204271.05862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204271.05870: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 26264 1727204271.05973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204271.05979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204271.05993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204271.06064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204271.06081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204271.06097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204271.06241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204271.07866: stdout chunk (state=3): >>>/root <<< 26264 1727204271.08039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204271.08042: stderr chunk (state=3): >>><<< 26264 1727204271.08045: stdout chunk (state=3): >>><<< 26264 1727204271.08074: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204271.08091: _low_level_execute_command(): starting 26264 1727204271.08098: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437 `" && echo ansible-tmp-1727204271.0807548-29113-239695926399437="` echo /root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437 `" ) && sleep 0' 26264 1727204271.09367: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204271.09378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204271.09396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204271.09410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204271.09450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 
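The second `_low_level_execute_command()` above creates the per-task remote temp directory with one compound `sh` command: set a restrictive umask, ensure the parent exists, create a unique task directory, and echo `name=path` back so the controller learns the resolved path. A local sketch of that same pattern (run against the local `/tmp` rather than a remote host; names are illustrative):

```python
# Sketch of Ansible's remote tmp-dir creation pattern from the log, run locally.
import os
import subprocess
import tempfile
import time

base = os.path.join(tempfile.gettempdir(), "ansible-demo")
task_dir = f"ansible-tmp-{time.time()}-demo"  # unique per-task name (illustrative)
cmd = (
    f'( umask 77 && mkdir -p "{base}" && mkdir "{base}/{task_dir}" '
    f'&& echo {task_dir}="{base}/{task_dir}" ) && sleep 0'
)
result = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
print(result.stdout.strip())  # the controller parses this name=path line
```

The `umask 77` makes the directory private to the connecting user (mode 700), and echoing the path back is how the controller discovers where to transfer the AnsiballZ payload seen later in the log.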
26264 1727204271.09461: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204271.09472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204271.09486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204271.09502: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204271.09508: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204271.09517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204271.09526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204271.09537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204271.09544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204271.09553: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204271.09562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204271.09633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204271.09647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204271.09660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204271.09833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204271.11594: stdout chunk (state=3): >>>ansible-tmp-1727204271.0807548-29113-239695926399437=/root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437 <<< 26264 1727204271.11784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204271.11788: stdout chunk (state=3): >>><<< 26264 1727204271.11798: stderr chunk (state=3): 
>>><<< 26264 1727204271.11824: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204271.0807548-29113-239695926399437=/root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204271.11879: variable 'ansible_module_compression' from source: unknown 26264 1727204271.11926: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 26264 1727204271.11971: variable 'ansible_facts' from source: unknown 26264 1727204271.12056: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437/AnsiballZ_service_facts.py 26264 1727204271.12210: Sending initial data 26264 1727204271.12214: Sent initial data (162 bytes) 26264 1727204271.13351: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 26264 1727204271.13986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204271.13998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204271.14013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204271.14056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204271.14066: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204271.14076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204271.14671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204271.14674: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204271.14677: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204271.14679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204271.14681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204271.14682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204271.14684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204271.14686: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204271.14688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204271.14828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204271.14843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204271.14849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 
1727204271.15030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204271.16701: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204271.16733: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204271.16783: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp5z1ugbjo /root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437/AnsiballZ_service_facts.py <<< 26264 1727204271.16813: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204271.18018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204271.18107: stderr chunk (state=3): >>><<< 26264 1727204271.18110: stdout chunk (state=3): >>><<< 26264 1727204271.18133: done transferring module to remote 26264 1727204271.18145: _low_level_execute_command(): starting 26264 1727204271.18153: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437/ /root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437/AnsiballZ_service_facts.py && sleep 0' 26264 1727204271.18849: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204271.18861: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204271.18875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204271.18895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204271.18930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204271.18936: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204271.18945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204271.18961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204271.18969: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204271.18976: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204271.18989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204271.19003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204271.19013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204271.19020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204271.19026: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204271.19034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204271.19205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204271.19209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204271.19211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204271.19318: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 26264 1727204271.21022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204271.21026: stdout chunk (state=3): >>><<< 26264 1727204271.21032: stderr chunk (state=3): >>><<< 26264 1727204271.21051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204271.21057: _low_level_execute_command(): starting 26264 1727204271.21065: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437/AnsiballZ_service_facts.py && sleep 0' 26264 1727204271.21868: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204271.21876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204271.21887: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204271.21901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204271.21945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204271.21948: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204271.21963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204271.21978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204271.21985: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204271.21992: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204271.22000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204271.22009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204271.22026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204271.22034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204271.22040: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204271.22050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204271.22126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204271.22146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204271.22162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204271.22241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204272.47822: stdout chunk (state=3): >>> 
{"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": 
"dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": 
"ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "stat<<< 26264 1727204272.47880: stdout chunk (state=3): >>>e": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": 
"systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": 
"container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.s<<< 26264 1727204272.47887: stdout chunk (state=3): >>>ervice": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": 
"systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 26264 1727204272.49204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204272.49208: stdout chunk (state=3): >>><<< 26264 1727204272.49216: stderr chunk (state=3): >>><<< 26264 1727204272.49243: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": 
{"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, 
"grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": 
"systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204272.49923: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204272.49947: _low_level_execute_command(): starting 26264 1727204272.49958: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204271.0807548-29113-239695926399437/ > /dev/null 2>&1 && sleep 0' 26264 1727204272.50653: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204272.50674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.50691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204272.50717: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.50768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204272.50781: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204272.50796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.50822: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204272.50837: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204272.50848: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204272.50862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.50879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204272.50897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.50912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204272.50928: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204272.50942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.51019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204272.51043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204272.51058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204272.51142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204272.52912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204272.53007: stderr chunk (state=3): >>><<< 26264 1727204272.53022: stdout chunk (state=3): 
>>><<< 26264 1727204272.53047: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204272.53056: handler run complete 26264 1727204272.53244: variable 'ansible_facts' from source: unknown 26264 1727204272.53414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204272.53914: variable 'ansible_facts' from source: unknown 26264 1727204272.54274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204272.54467: attempt loop complete, returning result 26264 1727204272.54478: _execute() done 26264 1727204272.54485: dumping result to json 26264 1727204272.54544: done dumping result, returning 26264 1727204272.54559: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which 
services are running [0affcd87-79f5-5ff5-08b0-00000000045d] 26264 1727204272.54572: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000045d ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204272.55354: no more pending results, returning what we have 26264 1727204272.55359: results queue empty 26264 1727204272.55360: checking for any_errors_fatal 26264 1727204272.55365: done checking for any_errors_fatal 26264 1727204272.55366: checking for max_fail_percentage 26264 1727204272.55368: done checking for max_fail_percentage 26264 1727204272.55369: checking to see if all hosts have failed and the running result is not ok 26264 1727204272.55370: done checking to see if all hosts have failed 26264 1727204272.55371: getting the remaining hosts for this loop 26264 1727204272.55372: done getting the remaining hosts for this loop 26264 1727204272.55377: getting the next task for host managed-node3 26264 1727204272.55382: done getting next task for host managed-node3 26264 1727204272.55386: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 26264 1727204272.55388: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204272.55402: getting variables 26264 1727204272.55404: in VariableManager get_vars() 26264 1727204272.55438: Calling all_inventory to load vars for managed-node3 26264 1727204272.55441: Calling groups_inventory to load vars for managed-node3 26264 1727204272.55443: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204272.55457: Calling all_plugins_play to load vars for managed-node3 26264 1727204272.55459: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204272.55463: Calling groups_plugins_play to load vars for managed-node3 26264 1727204272.55985: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000045d 26264 1727204272.55988: WORKER PROCESS EXITING 26264 1727204272.56889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204272.58490: done with get_vars() 26264 1727204272.58528: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:52 -0400 (0:00:01.579) 0:00:36.435 ***** 26264 1727204272.58629: entering _queue_task() for managed-node3/package_facts 26264 1727204272.58962: worker is 1 (out of 1 available) 26264 1727204272.58977: exiting _queue_task() for managed-node3/package_facts 26264 1727204272.58990: done queuing things up, now waiting for results queue to drain 26264 1727204272.58992: waiting for pending results... 
26264 1727204272.59277: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 26264 1727204272.59440: in run() - task 0affcd87-79f5-5ff5-08b0-00000000045e 26264 1727204272.59458: variable 'ansible_search_path' from source: unknown 26264 1727204272.59467: variable 'ansible_search_path' from source: unknown 26264 1727204272.59509: calling self._execute() 26264 1727204272.59611: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204272.59623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204272.59639: variable 'omit' from source: magic vars 26264 1727204272.60026: variable 'ansible_distribution_major_version' from source: facts 26264 1727204272.60045: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204272.60056: variable 'omit' from source: magic vars 26264 1727204272.60121: variable 'omit' from source: magic vars 26264 1727204272.60160: variable 'omit' from source: magic vars 26264 1727204272.60214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204272.60256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204272.60287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204272.60314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204272.60331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204272.60367: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204272.60377: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204272.60386: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 26264 1727204272.60487: Set connection var ansible_pipelining to False 26264 1727204272.60495: Set connection var ansible_connection to ssh 26264 1727204272.60502: Set connection var ansible_shell_type to sh 26264 1727204272.60513: Set connection var ansible_shell_executable to /bin/sh 26264 1727204272.60529: Set connection var ansible_timeout to 10 26264 1727204272.60540: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204272.60571: variable 'ansible_shell_executable' from source: unknown 26264 1727204272.60579: variable 'ansible_connection' from source: unknown 26264 1727204272.60585: variable 'ansible_module_compression' from source: unknown 26264 1727204272.60591: variable 'ansible_shell_type' from source: unknown 26264 1727204272.60597: variable 'ansible_shell_executable' from source: unknown 26264 1727204272.60602: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204272.60608: variable 'ansible_pipelining' from source: unknown 26264 1727204272.60614: variable 'ansible_timeout' from source: unknown 26264 1727204272.60620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204272.60819: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204272.60836: variable 'omit' from source: magic vars 26264 1727204272.60850: starting attempt loop 26264 1727204272.60857: running the handler 26264 1727204272.60877: _low_level_execute_command(): starting 26264 1727204272.60890: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204272.61626: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204272.61643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 26264 1727204272.61659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204272.61681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.61727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204272.61739: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204272.61754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.61775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204272.61788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204272.61800: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204272.61813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.61831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204272.61847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.61859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204272.61874: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204272.61887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.61962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204272.61982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204272.61997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204272.62078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 
1727204272.63600: stdout chunk (state=3): >>>/root <<< 26264 1727204272.63696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204272.63776: stderr chunk (state=3): >>><<< 26264 1727204272.63780: stdout chunk (state=3): >>><<< 26264 1727204272.63806: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204272.63823: _low_level_execute_command(): starting 26264 1727204272.63831: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392 `" && echo ansible-tmp-1727204272.6380644-29172-149632395336392="` echo /root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392 `" ) && sleep 0' 26264 1727204272.64482: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 26264 1727204272.64490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.64499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204272.64514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.64556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204272.64563: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204272.64576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.64589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204272.64597: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204272.64603: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204272.64611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.64620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204272.64631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.64638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204272.64645: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204272.64657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.64729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204272.64744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204272.64749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 26264 1727204272.64835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204272.66639: stdout chunk (state=3): >>>ansible-tmp-1727204272.6380644-29172-149632395336392=/root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392 <<< 26264 1727204272.66771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204272.66830: stderr chunk (state=3): >>><<< 26264 1727204272.66833: stdout chunk (state=3): >>><<< 26264 1727204272.66851: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204272.6380644-29172-149632395336392=/root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204272.66902: variable 'ansible_module_compression' from source: unknown 26264 1727204272.66955: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 26264 1727204272.67019: variable 'ansible_facts' from source: unknown 26264 1727204272.67212: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392/AnsiballZ_package_facts.py 26264 1727204272.67373: Sending initial data 26264 1727204272.67376: Sent initial data (162 bytes) 26264 1727204272.68459: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204272.68463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.68468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204272.68470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.68496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204272.68507: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204272.68522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.68536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204272.68543: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204272.68553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204272.68558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.68570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204272.68582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.68588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
<<< 26264 1727204272.68638: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204272.68641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.68682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204272.68699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204272.68713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204272.68780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204272.70468: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204272.70475: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204272.70525: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpc55ullb1 /root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392/AnsiballZ_package_facts.py <<< 26264 1727204272.70561: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204272.72868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204272.73127: stderr chunk (state=3): >>><<< 26264 1727204272.73131: stdout chunk (state=3): >>><<< 26264 1727204272.73134: done 
transferring module to remote 26264 1727204272.73143: _low_level_execute_command(): starting 26264 1727204272.73146: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392/ /root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392/AnsiballZ_package_facts.py && sleep 0' 26264 1727204272.74182: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.74188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.74216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204272.74220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.74223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204272.74226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.74295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204272.74397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204272.74411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204272.76107: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 26264 1727204272.76192: stderr chunk (state=3): >>><<< 26264 1727204272.76196: stdout chunk (state=3): >>><<< 26264 1727204272.76293: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204272.76297: _low_level_execute_command(): starting 26264 1727204272.76300: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392/AnsiballZ_package_facts.py && sleep 0' 26264 1727204272.76872: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204272.76890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.76905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204272.76924: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.76979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204272.76992: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204272.77007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.77026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204272.77042: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204272.77057: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204272.77073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204272.77088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204272.77104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204272.77116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204272.77128: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204272.77152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204272.77233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204272.77252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204272.77270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204272.77363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204273.23061: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": 
[{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": 
"2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null,
"arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": 
"54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9",
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", 
"version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9",
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null,
"arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": 
"8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate",
"version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", 
"version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd",
"version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9",
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 26264 1727204273.23271: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 26264 1727204273.23287: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", 
"release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pe<<< 26264 1727204273.23293: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", 
"version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 26264 1727204273.23296: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", 
"release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 
26264 1727204273.23302: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", 
"version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 26264 1727204273.23306: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", 
"release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 26264 1727204273.23313: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": 
"ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 26264 1727204273.24915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204273.24918: stdout chunk (state=3): >>><<< 26264 1727204273.24921: stderr chunk (state=3): >>><<< 26264 1727204273.25380: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", 
"version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": 
[{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": 
"diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", 
"release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": 
"6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", 
"version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": 
[{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": 
"libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": 
"firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, 
"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": 
"481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": 
"yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": 
"python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, 
"invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204273.28013: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204273.28044: _low_level_execute_command(): starting 26264 1727204273.28191: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204272.6380644-29172-149632395336392/ > /dev/null 2>&1 && sleep 0' 26264 1727204273.29425: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204273.29437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204273.29448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204273.29468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204273.29506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204273.29513: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204273.29523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204273.29538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204273.29549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address 
<<< 26264 1727204273.29560: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204273.29570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204273.29580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204273.29592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204273.29600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204273.29606: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204273.29616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204273.29697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204273.29719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204273.29732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204273.29804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204273.31670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204273.31674: stdout chunk (state=3): >>><<< 26264 1727204273.31677: stderr chunk (state=3): >>><<< 26264 1727204273.31702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204273.31705: handler run complete 26264 1727204273.32722: variable 'ansible_facts' from source: unknown 26264 1727204273.37776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204273.39894: variable 'ansible_facts' from source: unknown 26264 1727204273.40349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204273.41151: attempt loop complete, returning result 26264 1727204273.41170: _execute() done 26264 1727204273.41173: dumping result to json 26264 1727204273.41398: done dumping result, returning 26264 1727204273.41407: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-5ff5-08b0-00000000045e] 26264 1727204273.41410: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000045e 26264 1727204273.50243: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000045e 26264 1727204273.50247: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204273.50357: no more pending results, returning what we have 26264 1727204273.50360: results queue empty 26264 1727204273.50361: checking for 
any_errors_fatal 26264 1727204273.50365: done checking for any_errors_fatal 26264 1727204273.50366: checking for max_fail_percentage 26264 1727204273.50368: done checking for max_fail_percentage 26264 1727204273.50368: checking to see if all hosts have failed and the running result is not ok 26264 1727204273.50369: done checking to see if all hosts have failed 26264 1727204273.50370: getting the remaining hosts for this loop 26264 1727204273.50371: done getting the remaining hosts for this loop 26264 1727204273.50374: getting the next task for host managed-node3 26264 1727204273.50379: done getting next task for host managed-node3 26264 1727204273.50382: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 26264 1727204273.50383: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204273.51004: getting variables 26264 1727204273.51005: in VariableManager get_vars() 26264 1727204273.51030: Calling all_inventory to load vars for managed-node3 26264 1727204273.51033: Calling groups_inventory to load vars for managed-node3 26264 1727204273.51035: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204273.51041: Calling all_plugins_play to load vars for managed-node3 26264 1727204273.51044: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204273.51046: Calling groups_plugins_play to load vars for managed-node3 26264 1727204273.54477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204273.57927: done with get_vars() 26264 1727204273.57960: done getting variables 26264 1727204273.58015: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.994) 0:00:37.429 ***** 26264 1727204273.58050: entering _queue_task() for managed-node3/debug 26264 1727204273.58873: worker is 1 (out of 1 available) 26264 1727204273.58888: exiting _queue_task() for managed-node3/debug 26264 1727204273.58902: done queuing things up, now waiting for results queue to drain 26264 1727204273.58903: waiting for pending results... 
26264 1727204273.59547: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 26264 1727204273.60371: in run() - task 0affcd87-79f5-5ff5-08b0-00000000005d 26264 1727204273.60397: variable 'ansible_search_path' from source: unknown 26264 1727204273.60407: variable 'ansible_search_path' from source: unknown 26264 1727204273.60458: calling self._execute() 26264 1727204273.60578: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204273.60590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204273.60608: variable 'omit' from source: magic vars 26264 1727204273.61023: variable 'ansible_distribution_major_version' from source: facts 26264 1727204273.61486: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204273.61498: variable 'omit' from source: magic vars 26264 1727204273.61541: variable 'omit' from source: magic vars 26264 1727204273.61659: variable 'network_provider' from source: set_fact 26264 1727204273.62292: variable 'omit' from source: magic vars 26264 1727204273.62343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204273.62391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204273.62436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204273.62467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204273.62595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204273.62629: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204273.62637: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 
1727204273.62644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204273.62755: Set connection var ansible_pipelining to False 26264 1727204273.62763: Set connection var ansible_connection to ssh 26264 1727204273.62772: Set connection var ansible_shell_type to sh 26264 1727204273.62782: Set connection var ansible_shell_executable to /bin/sh 26264 1727204273.62794: Set connection var ansible_timeout to 10 26264 1727204273.62805: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204273.62837: variable 'ansible_shell_executable' from source: unknown 26264 1727204273.62845: variable 'ansible_connection' from source: unknown 26264 1727204273.62854: variable 'ansible_module_compression' from source: unknown 26264 1727204273.62860: variable 'ansible_shell_type' from source: unknown 26264 1727204273.62946: variable 'ansible_shell_executable' from source: unknown 26264 1727204273.62956: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204273.62966: variable 'ansible_pipelining' from source: unknown 26264 1727204273.62973: variable 'ansible_timeout' from source: unknown 26264 1727204273.62980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204273.63124: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204273.63616: variable 'omit' from source: magic vars 26264 1727204273.63628: starting attempt loop 26264 1727204273.63634: running the handler 26264 1727204273.63687: handler run complete 26264 1727204273.63706: attempt loop complete, returning result 26264 1727204273.63714: _execute() done 26264 1727204273.63721: dumping result to json 26264 1727204273.63727: done dumping result, returning 
26264 1727204273.63739: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-5ff5-08b0-00000000005d] 26264 1727204273.63751: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000005d 26264 1727204273.63867: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000005d ok: [managed-node3] => {} MSG: Using network provider: nm 26264 1727204273.63934: no more pending results, returning what we have 26264 1727204273.63938: results queue empty 26264 1727204273.63939: checking for any_errors_fatal 26264 1727204273.63951: done checking for any_errors_fatal 26264 1727204273.63952: checking for max_fail_percentage 26264 1727204273.63954: done checking for max_fail_percentage 26264 1727204273.63955: checking to see if all hosts have failed and the running result is not ok 26264 1727204273.63956: done checking to see if all hosts have failed 26264 1727204273.63956: getting the remaining hosts for this loop 26264 1727204273.63958: done getting the remaining hosts for this loop 26264 1727204273.63962: getting the next task for host managed-node3 26264 1727204273.63971: done getting next task for host managed-node3 26264 1727204273.63975: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 26264 1727204273.63977: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204273.63991: getting variables 26264 1727204273.63993: in VariableManager get_vars() 26264 1727204273.64032: Calling all_inventory to load vars for managed-node3 26264 1727204273.64035: Calling groups_inventory to load vars for managed-node3 26264 1727204273.64037: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204273.64048: Calling all_plugins_play to load vars for managed-node3 26264 1727204273.64051: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204273.64055: Calling groups_plugins_play to load vars for managed-node3 26264 1727204273.65090: WORKER PROCESS EXITING 26264 1727204273.67301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204273.70161: done with get_vars() 26264 1727204273.70902: done getting variables 26264 1727204273.70977: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.129) 0:00:37.559 ***** 26264 1727204273.71014: entering _queue_task() for managed-node3/fail 26264 1727204273.71829: worker is 1 (out of 1 available) 26264 1727204273.71841: exiting _queue_task() for managed-node3/fail 26264 1727204273.71857: done queuing things up, now waiting for results queue to drain 26264 1727204273.71859: waiting for pending results... 
26264 1727204273.72183: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 26264 1727204273.72484: in run() - task 0affcd87-79f5-5ff5-08b0-00000000005e 26264 1727204273.72568: variable 'ansible_search_path' from source: unknown 26264 1727204273.72612: variable 'ansible_search_path' from source: unknown 26264 1727204273.72655: calling self._execute() 26264 1727204273.72807: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204273.72920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204273.72943: variable 'omit' from source: magic vars 26264 1727204273.73877: variable 'ansible_distribution_major_version' from source: facts 26264 1727204273.73899: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204273.74231: variable 'network_state' from source: role '' defaults 26264 1727204273.74248: Evaluated conditional (network_state != {}): False 26264 1727204273.74256: when evaluation is False, skipping this task 26264 1727204273.74266: _execute() done 26264 1727204273.74274: dumping result to json 26264 1727204273.74282: done dumping result, returning 26264 1727204273.74292: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-5ff5-08b0-00000000005e] 26264 1727204273.74303: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000005e 26264 1727204273.74418: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000005e 26264 1727204273.74426: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204273.74490: no more pending results, 
returning what we have 26264 1727204273.74494: results queue empty 26264 1727204273.74495: checking for any_errors_fatal 26264 1727204273.74502: done checking for any_errors_fatal 26264 1727204273.74503: checking for max_fail_percentage 26264 1727204273.74504: done checking for max_fail_percentage 26264 1727204273.74505: checking to see if all hosts have failed and the running result is not ok 26264 1727204273.74506: done checking to see if all hosts have failed 26264 1727204273.74507: getting the remaining hosts for this loop 26264 1727204273.74508: done getting the remaining hosts for this loop 26264 1727204273.74512: getting the next task for host managed-node3 26264 1727204273.74518: done getting next task for host managed-node3 26264 1727204273.74522: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 26264 1727204273.74524: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204273.74537: getting variables 26264 1727204273.74538: in VariableManager get_vars() 26264 1727204273.74583: Calling all_inventory to load vars for managed-node3 26264 1727204273.74586: Calling groups_inventory to load vars for managed-node3 26264 1727204273.74588: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204273.74600: Calling all_plugins_play to load vars for managed-node3 26264 1727204273.74603: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204273.74606: Calling groups_plugins_play to load vars for managed-node3 26264 1727204273.76937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204273.79517: done with get_vars() 26264 1727204273.79551: done getting variables 26264 1727204273.79616: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.086) 0:00:37.645 ***** 26264 1727204273.79654: entering _queue_task() for managed-node3/fail 26264 1727204273.80070: worker is 1 (out of 1 available) 26264 1727204273.80083: exiting _queue_task() for managed-node3/fail 26264 1727204273.80096: done queuing things up, now waiting for results queue to drain 26264 1727204273.80098: waiting for pending results... 
26264 1727204273.80390: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 26264 1727204273.80506: in run() - task 0affcd87-79f5-5ff5-08b0-00000000005f 26264 1727204273.80525: variable 'ansible_search_path' from source: unknown 26264 1727204273.80532: variable 'ansible_search_path' from source: unknown 26264 1727204273.80583: calling self._execute() 26264 1727204273.80687: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204273.80700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204273.80715: variable 'omit' from source: magic vars 26264 1727204273.81134: variable 'ansible_distribution_major_version' from source: facts 26264 1727204273.81157: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204273.81293: variable 'network_state' from source: role '' defaults 26264 1727204273.81313: Evaluated conditional (network_state != {}): False 26264 1727204273.81321: when evaluation is False, skipping this task 26264 1727204273.81328: _execute() done 26264 1727204273.81335: dumping result to json 26264 1727204273.81343: done dumping result, returning 26264 1727204273.81358: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-5ff5-08b0-00000000005f] 26264 1727204273.81374: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000005f skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204273.81534: no more pending results, returning what we have 26264 1727204273.81538: results queue empty 26264 1727204273.81539: checking for any_errors_fatal 26264 1727204273.81550: done checking for any_errors_fatal 
26264 1727204273.81551: checking for max_fail_percentage 26264 1727204273.81553: done checking for max_fail_percentage 26264 1727204273.81554: checking to see if all hosts have failed and the running result is not ok 26264 1727204273.81555: done checking to see if all hosts have failed 26264 1727204273.81556: getting the remaining hosts for this loop 26264 1727204273.81558: done getting the remaining hosts for this loop 26264 1727204273.81563: getting the next task for host managed-node3 26264 1727204273.81571: done getting next task for host managed-node3 26264 1727204273.81576: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 26264 1727204273.81578: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204273.81594: getting variables 26264 1727204273.81596: in VariableManager get_vars() 26264 1727204273.81639: Calling all_inventory to load vars for managed-node3 26264 1727204273.81642: Calling groups_inventory to load vars for managed-node3 26264 1727204273.81645: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204273.81661: Calling all_plugins_play to load vars for managed-node3 26264 1727204273.81667: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204273.81671: Calling groups_plugins_play to load vars for managed-node3 26264 1727204273.83356: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000005f 26264 1727204273.83360: WORKER PROCESS EXITING 26264 1727204273.85624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204273.89150: done with get_vars() 26264 1727204273.89185: done getting variables 26264 1727204273.89252: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.096) 0:00:37.742 ***** 26264 1727204273.89287: entering _queue_task() for managed-node3/fail 26264 1727204273.89660: worker is 1 (out of 1 available) 26264 1727204273.89675: exiting _queue_task() for managed-node3/fail 26264 1727204273.89688: done queuing things up, now waiting for results queue to drain 26264 1727204273.89690: waiting for pending results... 
26264 1727204273.89995: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 26264 1727204273.90118: in run() - task 0affcd87-79f5-5ff5-08b0-000000000060 26264 1727204273.90142: variable 'ansible_search_path' from source: unknown 26264 1727204273.90152: variable 'ansible_search_path' from source: unknown 26264 1727204273.90196: calling self._execute() 26264 1727204273.90299: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204273.90310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204273.90326: variable 'omit' from source: magic vars 26264 1727204273.90726: variable 'ansible_distribution_major_version' from source: facts 26264 1727204273.90744: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204273.90938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204273.93558: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204273.93669: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204273.93718: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204273.93761: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204273.93794: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204273.93882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204273.93939: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204273.93978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204273.94027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204273.94054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204273.94183: variable 'ansible_distribution_major_version' from source: facts 26264 1727204273.94206: Evaluated conditional (ansible_distribution_major_version | int > 9): False 26264 1727204273.94215: when evaluation is False, skipping this task 26264 1727204273.94223: _execute() done 26264 1727204273.94231: dumping result to json 26264 1727204273.94239: done dumping result, returning 26264 1727204273.94260: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-5ff5-08b0-000000000060] 26264 1727204273.94274: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000060 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 26264 1727204273.94431: no more pending results, returning what we have 26264 1727204273.94436: results queue empty 26264 1727204273.94437: checking for any_errors_fatal 26264 1727204273.94444: done checking for any_errors_fatal 26264 
1727204273.94445: checking for max_fail_percentage 26264 1727204273.94446: done checking for max_fail_percentage 26264 1727204273.94447: checking to see if all hosts have failed and the running result is not ok 26264 1727204273.94451: done checking to see if all hosts have failed 26264 1727204273.94452: getting the remaining hosts for this loop 26264 1727204273.94454: done getting the remaining hosts for this loop 26264 1727204273.94459: getting the next task for host managed-node3 26264 1727204273.94468: done getting next task for host managed-node3 26264 1727204273.94473: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 26264 1727204273.94475: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204273.94490: getting variables 26264 1727204273.94492: in VariableManager get_vars() 26264 1727204273.94531: Calling all_inventory to load vars for managed-node3 26264 1727204273.94534: Calling groups_inventory to load vars for managed-node3 26264 1727204273.94537: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204273.94551: Calling all_plugins_play to load vars for managed-node3 26264 1727204273.94555: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204273.94559: Calling groups_plugins_play to load vars for managed-node3 26264 1727204273.95583: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000060 26264 1727204273.95586: WORKER PROCESS EXITING 26264 1727204273.96435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204273.98316: done with get_vars() 26264 1727204273.98340: done getting variables 26264 1727204273.98406: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.091) 0:00:37.833 ***** 26264 1727204273.98440: entering _queue_task() for managed-node3/dnf 26264 1727204273.98779: worker is 1 (out of 1 available) 26264 1727204273.98792: exiting _queue_task() for managed-node3/dnf 26264 1727204273.98805: done queuing things up, now waiting for results queue to drain 26264 1727204273.98807: waiting for pending results... 
26264 1727204273.99092: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 26264 1727204273.99218: in run() - task 0affcd87-79f5-5ff5-08b0-000000000061 26264 1727204273.99239: variable 'ansible_search_path' from source: unknown 26264 1727204273.99256: variable 'ansible_search_path' from source: unknown 26264 1727204273.99302: calling self._execute() 26264 1727204273.99404: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204273.99416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204273.99432: variable 'omit' from source: magic vars 26264 1727204273.99836: variable 'ansible_distribution_major_version' from source: facts 26264 1727204273.99857: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204274.00080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204274.02521: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204274.02600: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204274.02650: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204274.02714: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204274.02752: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204274.02839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204274.02891: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204274.02922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.02985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.03015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.03170: variable 'ansible_distribution' from source: facts 26264 1727204274.03182: variable 'ansible_distribution_major_version' from source: facts 26264 1727204274.03205: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 26264 1727204274.03342: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204274.03509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204274.03541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204274.03576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.03631: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.03663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.03743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204274.03775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204274.03807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.03855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.03880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.03931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204274.03967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 
1727204274.04014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.04067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.04092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.04285: variable 'network_connections' from source: play vars 26264 1727204274.04302: variable 'profile' from source: play vars 26264 1727204274.04402: variable 'profile' from source: play vars 26264 1727204274.04415: variable 'interface' from source: set_fact 26264 1727204274.04490: variable 'interface' from source: set_fact 26264 1727204274.04585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204274.04850: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204274.04917: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204274.04955: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204274.05021: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204274.05073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204274.05154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204274.05196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.05233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204274.05289: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204274.05639: variable 'network_connections' from source: play vars 26264 1727204274.05650: variable 'profile' from source: play vars 26264 1727204274.05759: variable 'profile' from source: play vars 26264 1727204274.05787: variable 'interface' from source: set_fact 26264 1727204274.05853: variable 'interface' from source: set_fact 26264 1727204274.05885: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26264 1727204274.05894: when evaluation is False, skipping this task 26264 1727204274.05906: _execute() done 26264 1727204274.05915: dumping result to json 26264 1727204274.05924: done dumping result, returning 26264 1727204274.05936: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-000000000061] 26264 1727204274.05953: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000061 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26264 1727204274.06173: no more pending results, returning what we have 26264 1727204274.06177: results queue 
empty 26264 1727204274.06178: checking for any_errors_fatal 26264 1727204274.06185: done checking for any_errors_fatal 26264 1727204274.06186: checking for max_fail_percentage 26264 1727204274.06187: done checking for max_fail_percentage 26264 1727204274.06188: checking to see if all hosts have failed and the running result is not ok 26264 1727204274.06190: done checking to see if all hosts have failed 26264 1727204274.06190: getting the remaining hosts for this loop 26264 1727204274.06192: done getting the remaining hosts for this loop 26264 1727204274.06196: getting the next task for host managed-node3 26264 1727204274.06204: done getting next task for host managed-node3 26264 1727204274.06208: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 26264 1727204274.06211: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204274.06225: getting variables 26264 1727204274.06227: in VariableManager get_vars() 26264 1727204274.06277: Calling all_inventory to load vars for managed-node3 26264 1727204274.06281: Calling groups_inventory to load vars for managed-node3 26264 1727204274.06284: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204274.06296: Calling all_plugins_play to load vars for managed-node3 26264 1727204274.06300: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204274.06303: Calling groups_plugins_play to load vars for managed-node3 26264 1727204274.07283: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000061 26264 1727204274.07287: WORKER PROCESS EXITING 26264 1727204274.08112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204274.09977: done with get_vars() 26264 1727204274.10005: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 26264 1727204274.10206: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.117) 0:00:37.951 ***** 26264 1727204274.10239: entering _queue_task() for managed-node3/yum 26264 1727204274.10588: worker is 1 (out of 1 available) 26264 1727204274.10600: exiting _queue_task() for managed-node3/yum 26264 1727204274.10613: done queuing things up, now 
waiting for results queue to drain 26264 1727204274.10614: waiting for pending results... 26264 1727204274.11173: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 26264 1727204274.11313: in run() - task 0affcd87-79f5-5ff5-08b0-000000000062 26264 1727204274.11332: variable 'ansible_search_path' from source: unknown 26264 1727204274.11340: variable 'ansible_search_path' from source: unknown 26264 1727204274.11386: calling self._execute() 26264 1727204274.11486: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204274.11497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204274.11517: variable 'omit' from source: magic vars 26264 1727204274.11977: variable 'ansible_distribution_major_version' from source: facts 26264 1727204274.12052: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204274.12291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204274.16417: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204274.16502: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204274.16547: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204274.16587: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204274.16621: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204274.16708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204274.17109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204274.17143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.17195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.17215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.17322: variable 'ansible_distribution_major_version' from source: facts 26264 1727204274.17345: Evaluated conditional (ansible_distribution_major_version | int < 8): False 26264 1727204274.17353: when evaluation is False, skipping this task 26264 1727204274.17360: _execute() done 26264 1727204274.17371: dumping result to json 26264 1727204274.17385: done dumping result, returning 26264 1727204274.17397: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-000000000062] 26264 1727204274.17407: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000062 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 26264 1727204274.17562: no more pending results, returning 
what we have 26264 1727204274.17569: results queue empty 26264 1727204274.17570: checking for any_errors_fatal 26264 1727204274.17576: done checking for any_errors_fatal 26264 1727204274.17577: checking for max_fail_percentage 26264 1727204274.17579: done checking for max_fail_percentage 26264 1727204274.17580: checking to see if all hosts have failed and the running result is not ok 26264 1727204274.17581: done checking to see if all hosts have failed 26264 1727204274.17582: getting the remaining hosts for this loop 26264 1727204274.17584: done getting the remaining hosts for this loop 26264 1727204274.17588: getting the next task for host managed-node3 26264 1727204274.17595: done getting next task for host managed-node3 26264 1727204274.17600: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 26264 1727204274.17602: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204274.17615: getting variables 26264 1727204274.17617: in VariableManager get_vars() 26264 1727204274.17658: Calling all_inventory to load vars for managed-node3 26264 1727204274.17661: Calling groups_inventory to load vars for managed-node3 26264 1727204274.17665: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204274.17677: Calling all_plugins_play to load vars for managed-node3 26264 1727204274.17680: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204274.17683: Calling groups_plugins_play to load vars for managed-node3 26264 1727204274.18696: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000062 26264 1727204274.18700: WORKER PROCESS EXITING 26264 1727204274.19706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204274.21755: done with get_vars() 26264 1727204274.21782: done getting variables 26264 1727204274.21840: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.116) 0:00:38.067 ***** 26264 1727204274.21873: entering _queue_task() for managed-node3/fail 26264 1727204274.22191: worker is 1 (out of 1 available) 26264 1727204274.22203: exiting _queue_task() for managed-node3/fail 26264 1727204274.22216: done queuing things up, now waiting for results queue to drain 26264 1727204274.22218: waiting for pending results... 
26264 1727204274.22669: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 26264 1727204274.22799: in run() - task 0affcd87-79f5-5ff5-08b0-000000000063 26264 1727204274.22819: variable 'ansible_search_path' from source: unknown 26264 1727204274.22829: variable 'ansible_search_path' from source: unknown 26264 1727204274.22872: calling self._execute() 26264 1727204274.23019: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204274.23032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204274.23047: variable 'omit' from source: magic vars 26264 1727204274.23822: variable 'ansible_distribution_major_version' from source: facts 26264 1727204274.23896: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204274.24079: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204274.24583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204274.27517: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204274.27590: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204274.27760: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204274.27802: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204274.27837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204274.27922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 26264 1727204274.27977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204274.28075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.28123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.28246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.28343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204274.28427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204274.28459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.28511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.28532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.28591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204274.28629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204274.28661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.28753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.28778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.29092: variable 'network_connections' from source: play vars 26264 1727204274.29111: variable 'profile' from source: play vars 26264 1727204274.29198: variable 'profile' from source: play vars 26264 1727204274.29208: variable 'interface' from source: set_fact 26264 1727204274.29278: variable 'interface' from source: set_fact 26264 1727204274.29354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204274.29533: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204274.29582: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204274.29749: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204274.29791: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204274.29838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204274.29861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204274.29890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.29920: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204274.29973: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204274.30387: variable 'network_connections' from source: play vars 26264 1727204274.30397: variable 'profile' from source: play vars 26264 1727204274.30468: variable 'profile' from source: play vars 26264 1727204274.30477: variable 'interface' from source: set_fact 26264 1727204274.30538: variable 'interface' from source: set_fact 26264 1727204274.30571: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26264 1727204274.30580: when evaluation is False, skipping this task 26264 1727204274.30585: _execute() done 26264 1727204274.30591: dumping result to json 26264 1727204274.30596: done dumping result, returning 26264 1727204274.30620: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-000000000063] 26264 1727204274.30638: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000063 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26264 1727204274.30820: no more pending results, returning what we have 26264 1727204274.30824: results queue empty 26264 1727204274.30825: checking for any_errors_fatal 26264 1727204274.30832: done checking for any_errors_fatal 26264 1727204274.30833: checking for max_fail_percentage 26264 1727204274.30834: done checking for max_fail_percentage 26264 1727204274.30835: checking to see if all hosts have failed and the running result is not ok 26264 1727204274.30836: done checking to see if all hosts have failed 26264 1727204274.30837: getting the remaining hosts for this loop 26264 1727204274.30839: done getting the remaining hosts for this loop 26264 1727204274.30843: getting the next task for host managed-node3 26264 1727204274.30850: done getting next task for host managed-node3 26264 1727204274.30855: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 26264 1727204274.30857: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204274.30873: getting variables 26264 1727204274.30875: in VariableManager get_vars() 26264 1727204274.30917: Calling all_inventory to load vars for managed-node3 26264 1727204274.30921: Calling groups_inventory to load vars for managed-node3 26264 1727204274.30923: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204274.30935: Calling all_plugins_play to load vars for managed-node3 26264 1727204274.30938: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204274.30941: Calling groups_plugins_play to load vars for managed-node3 26264 1727204274.31983: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000063 26264 1727204274.31987: WORKER PROCESS EXITING 26264 1727204274.32821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204274.35109: done with get_vars() 26264 1727204274.35140: done getting variables 26264 1727204274.35200: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.133) 0:00:38.201 ***** 26264 1727204274.35232: entering _queue_task() for managed-node3/package 26264 1727204274.35553: worker is 1 (out of 1 available) 26264 1727204274.35641: exiting _queue_task() for managed-node3/package 26264 1727204274.35657: done queuing things up, now waiting for results queue to drain 26264 1727204274.35659: waiting for pending results... 
26264 1727204274.35926: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages
26264 1727204274.36040: in run() - task 0affcd87-79f5-5ff5-08b0-000000000064
26264 1727204274.36062: variable 'ansible_search_path' from source: unknown
26264 1727204274.36074: variable 'ansible_search_path' from source: unknown
26264 1727204274.36114: calling self._execute()
26264 1727204274.36217: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204274.36228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204274.36240: variable 'omit' from source: magic vars
26264 1727204274.36821: variable 'ansible_distribution_major_version' from source: facts
26264 1727204274.36841: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204274.37050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
26264 1727204274.37401: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
26264 1727204274.37451: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
26264 1727204274.37494: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
26264 1727204274.37576: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
26264 1727204274.37694: variable 'network_packages' from source: role '' defaults
26264 1727204274.37808: variable '__network_provider_setup' from source: role '' defaults
26264 1727204274.37822: variable '__network_service_name_default_nm' from source: role '' defaults
26264 1727204274.37901: variable '__network_service_name_default_nm' from source: role '' defaults
26264 1727204274.37915: variable '__network_packages_default_nm' from source: role '' defaults
26264 1727204274.37980: variable '__network_packages_default_nm' from source: role '' defaults
26264 1727204274.38177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
26264 1727204274.41103: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
26264 1727204274.41188: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
26264 1727204274.41229: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
26264 1727204274.41273: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
26264 1727204274.41304: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
26264 1727204274.41392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
26264 1727204274.41426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
26264 1727204274.41456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
26264 1727204274.41517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
26264 1727204274.41537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
26264 1727204274.41591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
26264 1727204274.41621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
26264 1727204274.41652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
26264 1727204274.41704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
26264 1727204274.41723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
26264 1727204274.42110: variable '__network_packages_default_gobject_packages' from source: role '' defaults
26264 1727204274.42220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
26264 1727204274.42247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
26264 1727204274.42286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
26264 1727204274.42332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
26264 1727204274.42350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
26264 1727204274.42450: variable 'ansible_python' from source: facts
26264 1727204274.42539: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
26264 1727204274.42631: variable '__network_wpa_supplicant_required' from source: role '' defaults
26264 1727204274.42722: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
26264 1727204274.42855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
26264 1727204274.42889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
26264 1727204274.42925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
26264 1727204274.42972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
26264 1727204274.43031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
26264 1727204274.43172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
26264 1727204274.43206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
26264 1727204274.43258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
26264 1727204274.43331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
26264 1727204274.43398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
26264 1727204274.43550: variable 'network_connections' from source: play vars
26264 1727204274.43568: variable 'profile' from source: play vars
26264 1727204274.43805: variable 'profile' from source: play vars
26264 1727204274.43818: variable 'interface' from source: set_fact
26264 1727204274.43893: variable 'interface' from source: set_fact
26264 1727204274.43974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
26264 1727204274.44006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
26264 1727204274.44046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
26264 1727204274.44086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
26264 1727204274.44171: variable '__network_wireless_connections_defined' from source: role '' defaults
26264 1727204274.44667: variable 'network_connections' from source: play vars
26264 1727204274.44680: variable 'profile' from source: play vars
26264 1727204274.44788: variable 'profile' from source: play vars
26264 1727204274.44801: variable 'interface' from source: set_fact
26264 1727204274.44877: variable 'interface' from source: set_fact
26264 1727204274.44916: variable '__network_packages_default_wireless' from source: role '' defaults
26264 1727204274.45002: variable '__network_wireless_connections_defined' from source: role '' defaults
26264 1727204274.45320: variable 'network_connections' from source: play vars
26264 1727204274.45330: variable 'profile' from source: play vars
26264 1727204274.45398: variable 'profile' from source: play vars
26264 1727204274.45407: variable 'interface' from source: set_fact
26264 1727204274.45513: variable 'interface' from source: set_fact
26264 1727204274.45548: variable '__network_packages_default_team' from source: role '' defaults
26264 1727204274.45636: variable '__network_team_connections_defined' from source: role '' defaults
26264 1727204274.45983: variable 'network_connections' from source: play vars
26264 1727204274.45993: variable 'profile' from source: play vars
26264 1727204274.46059: variable 'profile' from source: play vars
26264 1727204274.46072: variable 'interface' from source: set_fact
26264 1727204274.46175: variable 'interface' from source: set_fact
26264 1727204274.46238: variable '__network_service_name_default_initscripts' from source: role '' defaults
26264 1727204274.46347: variable '__network_service_name_default_initscripts' from source: role '' defaults
26264 1727204274.46560: variable '__network_packages_default_initscripts' from source: role '' defaults
26264 1727204274.46631: variable '__network_packages_default_initscripts' from source: role '' defaults
26264 1727204274.46989: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
26264 1727204274.47930: variable 'network_connections' from source: play vars
26264 1727204274.48067: variable 'profile' from source: play vars
26264 1727204274.48123: variable 'profile' from source: play vars
26264 1727204274.48131: variable 'interface' from source: set_fact
26264 1727204274.48201: variable 'interface' from source: set_fact
26264 1727204274.48282: variable 'ansible_distribution' from source: facts
26264 1727204274.48291: variable '__network_rh_distros' from source: role '' defaults
26264 1727204274.48300: variable 'ansible_distribution_major_version' from source: facts
26264 1727204274.48317: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
26264 1727204274.48655: variable 'ansible_distribution' from source: facts
26264 1727204274.48713: variable '__network_rh_distros' from source: role '' defaults
26264 1727204274.48724: variable 'ansible_distribution_major_version' from source: facts
26264 1727204274.48742: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
26264 1727204274.49021: variable 'ansible_distribution' from source: facts
26264 1727204274.49146: variable '__network_rh_distros' from source: role '' defaults
26264 1727204274.49157: variable 'ansible_distribution_major_version' from source: facts
26264 1727204274.49199: variable 'network_provider' from source: set_fact
26264 1727204274.49218: variable 'ansible_facts' from source: unknown
26264 1727204274.50775: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
26264 1727204274.50784: when evaluation is False, skipping this task
26264 1727204274.50790: _execute() done
26264 1727204274.50797: dumping result to json
26264 1727204274.50878: done dumping result, returning
26264 1727204274.50892: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-5ff5-08b0-000000000064]
26264 1727204274.50901: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000064
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
26264 1727204274.51048: no more pending results, returning what we have
26264 1727204274.51053: results queue empty
26264 1727204274.51054: checking for any_errors_fatal
26264 1727204274.51061: done checking for any_errors_fatal
26264 1727204274.51061: checking for max_fail_percentage
26264 1727204274.51063: done checking for max_fail_percentage
26264 1727204274.51065: checking to see if all hosts have failed and the running result is not ok
26264 1727204274.51067: done checking to see if all hosts have failed
26264 1727204274.51067: getting the remaining hosts for this loop
26264 1727204274.51069: done getting the remaining hosts for this loop
26264 1727204274.51074: getting the next task for host managed-node3
26264 1727204274.51081: done getting next task for host managed-node3
26264 1727204274.51085: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
26264 1727204274.51088: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204274.51102: getting variables
26264 1727204274.51104: in VariableManager get_vars()
26264 1727204274.51144: Calling all_inventory to load vars for managed-node3
26264 1727204274.51147: Calling groups_inventory to load vars for managed-node3
26264 1727204274.51150: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204274.51161: Calling all_plugins_play to load vars for managed-node3
26264 1727204274.51171: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204274.51174: Calling groups_plugins_play to load vars for managed-node3
26264 1727204274.52185: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000064
26264 1727204274.52189: WORKER PROCESS EXITING
26264 1727204274.53941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204274.58677: done with get_vars()
26264 1727204274.58712: done getting variables
26264 1727204274.58779: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.235) 0:00:38.437 *****
26264 1727204274.58812: entering _queue_task() for managed-node3/package
26264 1727204274.60118: worker is 1 (out of 1 available)
26264 1727204274.60131: exiting _queue_task() for managed-node3/package
26264 1727204274.60145: done queuing things up, now waiting for results queue to drain
26264 1727204274.60146: waiting for pending results...
26264 1727204274.60996: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
26264 1727204274.61841: in run() - task 0affcd87-79f5-5ff5-08b0-000000000065
26264 1727204274.61935: variable 'ansible_search_path' from source: unknown
26264 1727204274.61945: variable 'ansible_search_path' from source: unknown
26264 1727204274.61992: calling self._execute()
26264 1727204274.62126: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204274.62255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204274.62373: variable 'omit' from source: magic vars
26264 1727204274.63153: variable 'ansible_distribution_major_version' from source: facts
26264 1727204274.63177: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204274.63418: variable 'network_state' from source: role '' defaults
26264 1727204274.63438: Evaluated conditional (network_state != {}): False
26264 1727204274.63454: when evaluation is False, skipping this task
26264 1727204274.63467: _execute() done
26264 1727204274.63480: dumping result to json
26264 1727204274.63565: done dumping result, returning
26264 1727204274.63579: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-5ff5-08b0-000000000065]
26264 1727204274.63592: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000065
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
26264 1727204274.63759: no more pending results, returning what we have
26264 1727204274.63766: results queue empty
26264 1727204274.63766: checking for any_errors_fatal
26264 1727204274.63772: done checking for any_errors_fatal
26264 1727204274.63773: checking for max_fail_percentage
26264 1727204274.63775: done checking for max_fail_percentage
26264 1727204274.63776: checking to see if all hosts have failed and the running result is not ok
26264 1727204274.63777: done checking to see if all hosts have failed
26264 1727204274.63778: getting the remaining hosts for this loop
26264 1727204274.63779: done getting the remaining hosts for this loop
26264 1727204274.63784: getting the next task for host managed-node3
26264 1727204274.63790: done getting next task for host managed-node3
26264 1727204274.63795: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
26264 1727204274.63797: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204274.63818: getting variables
26264 1727204274.63820: in VariableManager get_vars()
26264 1727204274.63861: Calling all_inventory to load vars for managed-node3
26264 1727204274.63867: Calling groups_inventory to load vars for managed-node3
26264 1727204274.63869: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204274.63883: Calling all_plugins_play to load vars for managed-node3
26264 1727204274.63887: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204274.63890: Calling groups_plugins_play to load vars for managed-node3
26264 1727204274.64870: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000065
26264 1727204274.64875: WORKER PROCESS EXITING
26264 1727204274.67207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204274.72434: done with get_vars()
26264 1727204274.72574: done getting variables
26264 1727204274.72639: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.138) 0:00:38.575 *****
26264 1727204274.72791: entering _queue_task() for managed-node3/package
26264 1727204274.73480: worker is 1 (out of 1 available)
26264 1727204274.73494: exiting _queue_task() for managed-node3/package
26264 1727204274.73507: done queuing things up, now waiting for results queue to drain
26264 1727204274.73509: waiting for pending results...
26264 1727204274.74953: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
26264 1727204274.75412: in run() - task 0affcd87-79f5-5ff5-08b0-000000000066
26264 1727204274.75434: variable 'ansible_search_path' from source: unknown
26264 1727204274.75577: variable 'ansible_search_path' from source: unknown
26264 1727204274.75625: calling self._execute()
26264 1727204274.75978: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204274.75991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204274.76074: variable 'omit' from source: magic vars
26264 1727204274.77390: variable 'ansible_distribution_major_version' from source: facts
26264 1727204274.77411: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204274.77875: variable 'network_state' from source: role '' defaults
26264 1727204274.77894: Evaluated conditional (network_state != {}): False
26264 1727204274.77902: when evaluation is False, skipping this task
26264 1727204274.77920: _execute() done
26264 1727204274.78032: dumping result to json
26264 1727204274.78041: done dumping result, returning
26264 1727204274.78056: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-5ff5-08b0-000000000066]
26264 1727204274.78143: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000066
26264 1727204274.78276: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000066
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
26264 1727204274.78327: no more pending results, returning what we have
26264 1727204274.78331: results queue empty
26264 1727204274.78332: checking for any_errors_fatal
26264 1727204274.78340: done checking for any_errors_fatal
26264 1727204274.78340: checking for max_fail_percentage
26264 1727204274.78342: done checking for max_fail_percentage
26264 1727204274.78343: checking to see if all hosts have failed and the running result is not ok
26264 1727204274.78344: done checking to see if all hosts have failed
26264 1727204274.78345: getting the remaining hosts for this loop
26264 1727204274.78346: done getting the remaining hosts for this loop
26264 1727204274.78350: getting the next task for host managed-node3
26264 1727204274.78357: done getting next task for host managed-node3
26264 1727204274.78361: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
26264 1727204274.78362: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204274.78379: getting variables
26264 1727204274.78381: in VariableManager get_vars()
26264 1727204274.78423: Calling all_inventory to load vars for managed-node3
26264 1727204274.78428: Calling groups_inventory to load vars for managed-node3
26264 1727204274.78431: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204274.78445: Calling all_plugins_play to load vars for managed-node3
26264 1727204274.78449: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204274.78453: Calling groups_plugins_play to load vars for managed-node3
26264 1727204274.80172: WORKER PROCESS EXITING
26264 1727204274.81875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204274.86994: done with get_vars()
26264 1727204274.87031: done getting variables
26264 1727204274.87297: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.146) 0:00:38.722 *****
26264 1727204274.87329: entering _queue_task() for managed-node3/service
26264 1727204274.87861: worker is 1 (out of 1 available)
26264 1727204274.88090: exiting _queue_task() for managed-node3/service
26264 1727204274.88105: done queuing things up, now waiting for results queue to drain
26264 1727204274.88107: waiting for pending results...
26264 1727204274.89325: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 26264 1727204274.89456: in run() - task 0affcd87-79f5-5ff5-08b0-000000000067 26264 1727204274.89535: variable 'ansible_search_path' from source: unknown 26264 1727204274.89634: variable 'ansible_search_path' from source: unknown 26264 1727204274.89683: calling self._execute() 26264 1727204274.89902: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204274.89914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204274.89928: variable 'omit' from source: magic vars 26264 1727204274.90947: variable 'ansible_distribution_major_version' from source: facts 26264 1727204274.90973: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204274.91219: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204274.91667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204274.97762: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204274.97953: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204274.98028: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204274.98133: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204274.98196: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204274.98401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 26264 1727204274.98459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204274.98646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.98699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.98718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.98798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204274.98954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204274.98992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.99038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.99182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.99228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204274.99259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204274.99308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204274.99425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204274.99443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204274.99994: variable 'network_connections' from source: play vars 26264 1727204275.00067: variable 'profile' from source: play vars 26264 1727204275.00262: variable 'profile' from source: play vars 26264 1727204275.00276: variable 'interface' from source: set_fact 26264 1727204275.00343: variable 'interface' from source: set_fact 26264 1727204275.00547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204275.00928: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204275.00975: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204275.01166: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204275.01199: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204275.01283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204275.01314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204275.01383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204275.01486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204275.01540: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204275.02222: variable 'network_connections' from source: play vars 26264 1727204275.02239: variable 'profile' from source: play vars 26264 1727204275.02398: variable 'profile' from source: play vars 26264 1727204275.02437: variable 'interface' from source: set_fact 26264 1727204275.02513: variable 'interface' from source: set_fact 26264 1727204275.02685: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26264 1727204275.02693: when evaluation is False, skipping this task 26264 1727204275.02702: _execute() done 26264 1727204275.02711: dumping result to json 26264 1727204275.02722: done dumping result, returning 26264 1727204275.02734: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcd87-79f5-5ff5-08b0-000000000067] 26264 1727204275.02754: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000067 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26264 1727204275.03023: no more pending results, returning what we have 26264 1727204275.03028: results queue empty 26264 1727204275.03029: checking for any_errors_fatal 26264 1727204275.03036: done checking for any_errors_fatal 26264 1727204275.03037: checking for max_fail_percentage 26264 1727204275.03039: done checking for max_fail_percentage 26264 1727204275.03040: checking to see if all hosts have failed and the running result is not ok 26264 1727204275.03041: done checking to see if all hosts have failed 26264 1727204275.03042: getting the remaining hosts for this loop 26264 1727204275.03044: done getting the remaining hosts for this loop 26264 1727204275.03051: getting the next task for host managed-node3 26264 1727204275.03058: done getting next task for host managed-node3 26264 1727204275.03065: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 26264 1727204275.03068: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204275.03082: getting variables 26264 1727204275.03084: in VariableManager get_vars() 26264 1727204275.03126: Calling all_inventory to load vars for managed-node3 26264 1727204275.03129: Calling groups_inventory to load vars for managed-node3 26264 1727204275.03132: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204275.03144: Calling all_plugins_play to load vars for managed-node3 26264 1727204275.03151: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204275.03155: Calling groups_plugins_play to load vars for managed-node3 26264 1727204275.04362: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000067 26264 1727204275.04367: WORKER PROCESS EXITING 26264 1727204275.06722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204275.10673: done with get_vars() 26264 1727204275.11044: done getting variables 26264 1727204275.11112: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:55 -0400 (0:00:00.238) 0:00:38.960 ***** 26264 1727204275.11484: entering _queue_task() for managed-node3/service 26264 1727204275.12409: worker is 1 (out of 1 available) 26264 1727204275.12421: exiting _queue_task() for managed-node3/service 26264 1727204275.12437: done queuing things up, now waiting for results queue to drain 26264 1727204275.12439: waiting for pending results... 
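The trace just above shows a `when:` conditional evaluating to False and the "Restart NetworkManager due to wireless or team interfaces" task being skipped with a structured result. The skip decision can be sketched in plain Python — this is a minimal illustration of the logged behavior, not Ansible's actual `TaskExecutor` code; `run_when` and its arguments are hypothetical names:

```python
# Minimal sketch (not Ansible's implementation) of how a `when:` conditional
# of the form "a or b" produces the "skipping" result seen in the log.
def run_when(task_vars, condition_terms):
    """Return an Ansible-style result dict for an `a or b` conditional."""
    condition = " or ".join(condition_terms)
    if any(bool(task_vars.get(term, False)) for term in condition_terms):
        return {"changed": True}  # the task would actually execute
    return {  # mirrors the skip result dumped to JSON in the log
        "changed": False,
        "false_condition": condition,
        "skip_reason": "Conditional result was False",
    }

result = run_when(
    {"__network_wireless_connections_defined": False,
     "__network_team_connections_defined": False},
    ["__network_wireless_connections_defined",
     "__network_team_connections_defined"],
)
print(result["skip_reason"])  # Conditional result was False
```

In the real run, the terms come from role defaults and the expression is templated through Jinja2 before the boolean test; the sketch only reproduces the final evaluate-then-skip step.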
26264 1727204275.13558: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 26264 1727204275.13776: in run() - task 0affcd87-79f5-5ff5-08b0-000000000068 26264 1727204275.13886: variable 'ansible_search_path' from source: unknown 26264 1727204275.13892: variable 'ansible_search_path' from source: unknown 26264 1727204275.13928: calling self._execute() 26264 1727204275.14037: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204275.14402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204275.14418: variable 'omit' from source: magic vars 26264 1727204275.18078: variable 'ansible_distribution_major_version' from source: facts 26264 1727204275.18289: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204275.19034: variable 'network_provider' from source: set_fact 26264 1727204275.19291: variable 'network_state' from source: role '' defaults 26264 1727204275.19342: Evaluated conditional (network_provider == "nm" or network_state != {}): True 26264 1727204275.19371: variable 'omit' from source: magic vars 26264 1727204275.19519: variable 'omit' from source: magic vars 26264 1727204275.19709: variable 'network_service_name' from source: role '' defaults 26264 1727204275.19961: variable 'network_service_name' from source: role '' defaults 26264 1727204275.20182: variable '__network_provider_setup' from source: role '' defaults 26264 1727204275.20284: variable '__network_service_name_default_nm' from source: role '' defaults 26264 1727204275.20353: variable '__network_service_name_default_nm' from source: role '' defaults 26264 1727204275.20579: variable '__network_packages_default_nm' from source: role '' defaults 26264 1727204275.20646: variable '__network_packages_default_nm' from source: role '' defaults 26264 1727204275.21186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 26264 1727204275.27346: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204275.27568: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204275.27666: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204275.27873: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204275.27907: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204275.28053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204275.28171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204275.28294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204275.28413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204275.28435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204275.28651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 26264 1727204275.28755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204275.28799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204275.28901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204275.29014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204275.29580: variable '__network_packages_default_gobject_packages' from source: role '' defaults 26264 1727204275.30012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204275.30111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204275.30228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204275.30337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204275.30391: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204275.30768: variable 'ansible_python' from source: facts 26264 1727204275.30797: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 26264 1727204275.30979: variable '__network_wpa_supplicant_required' from source: role '' defaults 26264 1727204275.31395: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26264 1727204275.31531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204275.31567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204275.31597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204275.31725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204275.31746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204275.31975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204275.32035: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204275.32125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204275.32221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204275.32305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204275.32718: variable 'network_connections' from source: play vars 26264 1727204275.32733: variable 'profile' from source: play vars 26264 1727204275.32908: variable 'profile' from source: play vars 26264 1727204275.32979: variable 'interface' from source: set_fact 26264 1727204275.33203: variable 'interface' from source: set_fact 26264 1727204275.33351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204275.33875: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204275.34112: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204275.34394: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204275.34461: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204275.34556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204275.34727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204275.34792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204275.34974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204275.35030: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204275.35731: variable 'network_connections' from source: play vars 26264 1727204275.35968: variable 'profile' from source: play vars 26264 1727204275.36052: variable 'profile' from source: play vars 26264 1727204275.36066: variable 'interface' from source: set_fact 26264 1727204275.36138: variable 'interface' from source: set_fact 26264 1727204275.36326: variable '__network_packages_default_wireless' from source: role '' defaults 26264 1727204275.36510: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204275.36915: variable 'network_connections' from source: play vars 26264 1727204275.36937: variable 'profile' from source: play vars 26264 1727204275.37029: variable 'profile' from source: play vars 26264 1727204275.37039: variable 'interface' from source: set_fact 26264 1727204275.37125: variable 'interface' from source: set_fact 26264 1727204275.37173: variable '__network_packages_default_team' from source: role '' defaults 26264 1727204275.37257: variable '__network_team_connections_defined' from source: role '' defaults 26264 1727204275.37668: variable 
'network_connections' from source: play vars 26264 1727204275.37689: variable 'profile' from source: play vars 26264 1727204275.37809: variable 'profile' from source: play vars 26264 1727204275.37820: variable 'interface' from source: set_fact 26264 1727204275.37937: variable 'interface' from source: set_fact 26264 1727204275.38036: variable '__network_service_name_default_initscripts' from source: role '' defaults 26264 1727204275.38106: variable '__network_service_name_default_initscripts' from source: role '' defaults 26264 1727204275.38117: variable '__network_packages_default_initscripts' from source: role '' defaults 26264 1727204275.38186: variable '__network_packages_default_initscripts' from source: role '' defaults 26264 1727204275.38462: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 26264 1727204275.39301: variable 'network_connections' from source: play vars 26264 1727204275.39312: variable 'profile' from source: play vars 26264 1727204275.39424: variable 'profile' from source: play vars 26264 1727204275.39433: variable 'interface' from source: set_fact 26264 1727204275.39600: variable 'interface' from source: set_fact 26264 1727204275.39615: variable 'ansible_distribution' from source: facts 26264 1727204275.39623: variable '__network_rh_distros' from source: role '' defaults 26264 1727204275.39634: variable 'ansible_distribution_major_version' from source: facts 26264 1727204275.39653: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 26264 1727204275.39906: variable 'ansible_distribution' from source: facts 26264 1727204275.39915: variable '__network_rh_distros' from source: role '' defaults 26264 1727204275.39925: variable 'ansible_distribution_major_version' from source: facts 26264 1727204275.39940: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 26264 1727204275.40282: variable 'ansible_distribution' from source: 
facts 26264 1727204275.40293: variable '__network_rh_distros' from source: role '' defaults 26264 1727204275.40326: variable 'ansible_distribution_major_version' from source: facts 26264 1727204275.40369: variable 'network_provider' from source: set_fact 26264 1727204275.40538: variable 'omit' from source: magic vars 26264 1727204275.40574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204275.40608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204275.40633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204275.40687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204275.40767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204275.40801: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204275.40808: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204275.40815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204275.41034: Set connection var ansible_pipelining to False 26264 1727204275.41042: Set connection var ansible_connection to ssh 26264 1727204275.41049: Set connection var ansible_shell_type to sh 26264 1727204275.41059: Set connection var ansible_shell_executable to /bin/sh 26264 1727204275.41091: Set connection var ansible_timeout to 10 26264 1727204275.41180: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204275.41216: variable 'ansible_shell_executable' from source: unknown 26264 1727204275.41223: variable 'ansible_connection' from source: unknown 26264 1727204275.41231: variable 'ansible_module_compression' from source: unknown 26264 1727204275.41237: 
variable 'ansible_shell_type' from source: unknown 26264 1727204275.41243: variable 'ansible_shell_executable' from source: unknown 26264 1727204275.41250: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204275.41266: variable 'ansible_pipelining' from source: unknown 26264 1727204275.41273: variable 'ansible_timeout' from source: unknown 26264 1727204275.41280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204275.41591: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204275.41650: variable 'omit' from source: magic vars 26264 1727204275.41661: starting attempt loop 26264 1727204275.41747: running the handler 26264 1727204275.41847: variable 'ansible_facts' from source: unknown 26264 1727204275.43604: _low_level_execute_command(): starting 26264 1727204275.43685: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204275.44926: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.44931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204275.44965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204275.44969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
<<< 26264 1727204275.44971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.44974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204275.45049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204275.45052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204275.45055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204275.45126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204275.46755: stdout chunk (state=3): >>>/root <<< 26264 1727204275.46854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204275.46957: stderr chunk (state=3): >>><<< 26264 1727204275.46970: stdout chunk (state=3): >>><<< 26264 1727204275.47037: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204275.47041: _low_level_execute_command(): starting 26264 1727204275.47088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434 `" && echo ansible-tmp-1727204275.4701562-29383-39592026847434="` echo /root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434 `" ) && sleep 0' 26264 1727204275.48855: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.48860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204275.49009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204275.49014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.49016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204275.49188: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 26264 1727204275.49201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204275.49262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204275.51081: stdout chunk (state=3): >>>ansible-tmp-1727204275.4701562-29383-39592026847434=/root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434 <<< 26264 1727204275.51195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204275.51281: stderr chunk (state=3): >>><<< 26264 1727204275.51285: stdout chunk (state=3): >>><<< 26264 1727204275.51672: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204275.4701562-29383-39592026847434=/root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 26264 1727204275.51676: variable 'ansible_module_compression' from source: unknown 26264 1727204275.51678: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 26264 1727204275.51680: variable 'ansible_facts' from source: unknown 26264 1727204275.51686: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434/AnsiballZ_systemd.py 26264 1727204275.52297: Sending initial data 26264 1727204275.52301: Sent initial data (155 bytes) 26264 1727204275.54735: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204275.54754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204275.54774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.54795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204275.54849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204275.54932: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204275.54949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204275.54971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204275.54984: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204275.54995: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204275.55008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204275.55021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.55038: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204275.55053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204275.55067: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204275.55089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204275.55303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204275.55321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204275.55337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204275.55493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204275.57113: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204275.57152: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204275.57193: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpxy6odor4 /root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434/AnsiballZ_systemd.py <<< 26264 1727204275.57246: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204275.60478: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 26264 1727204275.60482: stderr chunk (state=3): >>><<< 26264 1727204275.60485: stdout chunk (state=3): >>><<< 26264 1727204275.60487: done transferring module to remote 26264 1727204275.60489: _low_level_execute_command(): starting 26264 1727204275.60492: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434/ /root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434/AnsiballZ_systemd.py && sleep 0' 26264 1727204275.61051: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204275.61069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204275.61085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.61102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204275.61144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204275.61156: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204275.61176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204275.61194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204275.61206: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204275.61216: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204275.61228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204275.61241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.61256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204275.61270: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204275.61281: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204275.61296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204275.61377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204275.61396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204275.61410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204275.61480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204275.63219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204275.63223: stdout chunk (state=3): >>><<< 26264 1727204275.63230: stderr chunk (state=3): >>><<< 26264 1727204275.63244: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204275.63247: _low_level_execute_command(): starting 26264 1727204275.63252: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434/AnsiballZ_systemd.py && sleep 0' 26264 1727204275.63896: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204275.63904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204275.63913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.63927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204275.63966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204275.63981: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204275.63990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204275.64004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204275.64011: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204275.64018: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204275.64027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204275.64036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.64051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204275.64054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 
1727204275.64062: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204275.64074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204275.64151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204275.64168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204275.64180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204275.64263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204275.88867: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 26264 1727204275.88888: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "16162816", "MemoryAvailable": "infinity", "CPUUsageNSec": "1560139000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 
14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 26264 1727204275.90283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204275.90383: stderr chunk (state=3): >>><<< 26264 1727204275.90387: stdout chunk (state=3): >>><<< 26264 1727204275.90473: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "16162816", "MemoryAvailable": "infinity", "CPUUsageNSec": "1560139000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204275.90680: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204275.90683: _low_level_execute_command(): starting 26264 1727204275.90686: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204275.4701562-29383-39592026847434/ > /dev/null 2>&1 && sleep 0' 26264 1727204275.91579: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204275.91582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204275.91618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204275.91621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204275.91623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204275.91709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204275.91722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204275.91797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204275.93530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204275.93610: stderr chunk (state=3): >>><<< 26264 1727204275.93614: stdout chunk (state=3): >>><<< 26264 1727204275.93971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204275.93974: handler run complete 26264 1727204275.93977: attempt loop complete, returning result 26264 1727204275.93979: _execute() done 26264 1727204275.93981: dumping result to json 26264 1727204275.93983: done dumping result, returning 26264 1727204275.93985: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-5ff5-08b0-000000000068] 26264 1727204275.93987: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000068 26264 1727204275.94136: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000068 26264 1727204275.94139: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204275.94193: no more pending results, returning what we have 26264 1727204275.94200: results queue empty 26264 1727204275.94201: checking for any_errors_fatal 26264 1727204275.94217: done checking for any_errors_fatal 26264 1727204275.94219: checking for max_fail_percentage 26264 1727204275.94221: done checking for max_fail_percentage 26264 1727204275.94222: checking to see if all hosts have failed and the running result is not ok 26264 1727204275.94223: done checking to see if all hosts have failed 26264 1727204275.94224: getting the remaining hosts for this loop 26264 1727204275.94226: done getting the remaining hosts for this loop 26264 1727204275.94231: getting the next task for host managed-node3 26264 1727204275.94238: done getting next task for host managed-node3 26264 1727204275.94242: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 26264 1727204275.94244: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204275.94255: getting variables 26264 1727204275.94257: in VariableManager get_vars() 26264 1727204275.94300: Calling all_inventory to load vars for managed-node3 26264 1727204275.94303: Calling groups_inventory to load vars for managed-node3 26264 1727204275.94306: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204275.94317: Calling all_plugins_play to load vars for managed-node3 26264 1727204275.94322: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204275.94326: Calling groups_plugins_play to load vars for managed-node3 26264 1727204275.96103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204275.98307: done with get_vars() 26264 1727204275.98341: done getting variables 26264 1727204275.98412: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:55 -0400 (0:00:00.872) 0:00:39.833 ***** 26264 1727204275.98447: entering _queue_task() for managed-node3/service 26264 1727204275.98802: worker is 1 (out of 1 available) 26264 1727204275.98814: exiting _queue_task() for managed-node3/service 26264 1727204275.98828: done queuing things up, now waiting for results queue to drain 26264 1727204275.98829: waiting for pending results... 
26264 1727204275.99791: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 26264 1727204275.99878: in run() - task 0affcd87-79f5-5ff5-08b0-000000000069 26264 1727204275.99889: variable 'ansible_search_path' from source: unknown 26264 1727204275.99892: variable 'ansible_search_path' from source: unknown 26264 1727204275.99928: calling self._execute() 26264 1727204276.00004: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204276.00010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204276.00024: variable 'omit' from source: magic vars 26264 1727204276.00315: variable 'ansible_distribution_major_version' from source: facts 26264 1727204276.00327: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204276.00411: variable 'network_provider' from source: set_fact 26264 1727204276.00415: Evaluated conditional (network_provider == "nm"): True 26264 1727204276.00483: variable '__network_wpa_supplicant_required' from source: role '' defaults 26264 1727204276.00547: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26264 1727204276.00676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204276.03551: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204276.03640: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204276.03692: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204276.03743: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204276.03781: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204276.03880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204276.03919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204276.03972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204276.04022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204276.04046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204276.04108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204276.04133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204276.04184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204276.04226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204276.04250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204276.04306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204276.04332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204276.04361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204276.04408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204276.04421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204276.04572: variable 'network_connections' from source: play vars 26264 1727204276.04582: variable 'profile' from source: play vars 26264 1727204276.04654: variable 'profile' from source: play vars 26264 1727204276.04658: variable 'interface' from source: set_fact 26264 1727204276.04717: variable 'interface' from source: set_fact 26264 1727204276.05093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26264 1727204276.05096: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26264 1727204276.05098: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26264 1727204276.05100: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26264 1727204276.05102: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26264 1727204276.05105: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26264 1727204276.05107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26264 1727204276.05125: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204276.05153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26264 1727204276.05198: variable '__network_wireless_connections_defined' from source: role '' defaults 26264 1727204276.05431: variable 'network_connections' from source: play vars 26264 1727204276.05435: variable 'profile' from source: play vars 26264 1727204276.05496: variable 'profile' from source: play vars 26264 1727204276.05499: variable 'interface' from source: set_fact 26264 1727204276.05556: variable 'interface' from source: set_fact 26264 1727204276.05586: Evaluated conditional (__network_wpa_supplicant_required): False 26264 1727204276.05589: when evaluation is False, skipping this task 26264 1727204276.05592: _execute() done 26264 1727204276.05602: dumping result 
to json 26264 1727204276.05604: done dumping result, returning 26264 1727204276.05607: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-5ff5-08b0-000000000069] 26264 1727204276.05611: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000069 26264 1727204276.05704: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000069 26264 1727204276.05708: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 26264 1727204276.05778: no more pending results, returning what we have 26264 1727204276.05782: results queue empty 26264 1727204276.05783: checking for any_errors_fatal 26264 1727204276.05800: done checking for any_errors_fatal 26264 1727204276.05801: checking for max_fail_percentage 26264 1727204276.05803: done checking for max_fail_percentage 26264 1727204276.05804: checking to see if all hosts have failed and the running result is not ok 26264 1727204276.05805: done checking to see if all hosts have failed 26264 1727204276.05805: getting the remaining hosts for this loop 26264 1727204276.05807: done getting the remaining hosts for this loop 26264 1727204276.05811: getting the next task for host managed-node3 26264 1727204276.05816: done getting next task for host managed-node3 26264 1727204276.05820: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 26264 1727204276.05822: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204276.05836: getting variables 26264 1727204276.05838: in VariableManager get_vars() 26264 1727204276.05883: Calling all_inventory to load vars for managed-node3 26264 1727204276.05886: Calling groups_inventory to load vars for managed-node3 26264 1727204276.05889: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204276.05898: Calling all_plugins_play to load vars for managed-node3 26264 1727204276.05901: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204276.05904: Calling groups_plugins_play to load vars for managed-node3 26264 1727204276.07771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204276.09782: done with get_vars() 26264 1727204276.09816: done getting variables 26264 1727204276.09885: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.114) 0:00:39.948 ***** 26264 1727204276.09921: entering _queue_task() for managed-node3/service 26264 1727204276.10267: worker is 1 (out of 1 available) 26264 1727204276.10280: exiting _queue_task() for managed-node3/service 26264 1727204276.10293: done queuing things up, now waiting for results queue to drain 26264 1727204276.10295: waiting for pending results... 
26264 1727204276.10586: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 26264 1727204276.10719: in run() - task 0affcd87-79f5-5ff5-08b0-00000000006a 26264 1727204276.10732: variable 'ansible_search_path' from source: unknown 26264 1727204276.10740: variable 'ansible_search_path' from source: unknown 26264 1727204276.10779: calling self._execute() 26264 1727204276.10934: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204276.10938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204276.10952: variable 'omit' from source: magic vars 26264 1727204276.11425: variable 'ansible_distribution_major_version' from source: facts 26264 1727204276.11439: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204276.11568: variable 'network_provider' from source: set_fact 26264 1727204276.11574: Evaluated conditional (network_provider == "initscripts"): False 26264 1727204276.11577: when evaluation is False, skipping this task 26264 1727204276.11580: _execute() done 26264 1727204276.11582: dumping result to json 26264 1727204276.11587: done dumping result, returning 26264 1727204276.11594: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-5ff5-08b0-00000000006a] 26264 1727204276.11601: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006a 26264 1727204276.11701: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006a 26264 1727204276.11704: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26264 1727204276.11762: no more pending results, returning what we have 26264 1727204276.11769: results queue empty 26264 1727204276.11770: checking for any_errors_fatal 26264 1727204276.11780: done checking for 
any_errors_fatal 26264 1727204276.11781: checking for max_fail_percentage 26264 1727204276.11783: done checking for max_fail_percentage 26264 1727204276.11783: checking to see if all hosts have failed and the running result is not ok 26264 1727204276.11784: done checking to see if all hosts have failed 26264 1727204276.11785: getting the remaining hosts for this loop 26264 1727204276.11787: done getting the remaining hosts for this loop 26264 1727204276.11791: getting the next task for host managed-node3 26264 1727204276.11798: done getting next task for host managed-node3 26264 1727204276.11802: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 26264 1727204276.11805: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204276.11820: getting variables 26264 1727204276.11822: in VariableManager get_vars() 26264 1727204276.11862: Calling all_inventory to load vars for managed-node3 26264 1727204276.11867: Calling groups_inventory to load vars for managed-node3 26264 1727204276.11870: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204276.11884: Calling all_plugins_play to load vars for managed-node3 26264 1727204276.11888: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204276.11891: Calling groups_plugins_play to load vars for managed-node3 26264 1727204276.15046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204276.18855: done with get_vars() 26264 1727204276.18886: done getting variables 26264 1727204276.18958: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.090) 0:00:40.039 ***** 26264 1727204276.19000: entering _queue_task() for managed-node3/copy 26264 1727204276.19409: worker is 1 (out of 1 available) 26264 1727204276.19422: exiting _queue_task() for managed-node3/copy 26264 1727204276.19441: done queuing things up, now waiting for results queue to drain 26264 1727204276.19443: waiting for pending results... 
26264 1727204276.19754: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 26264 1727204276.19874: in run() - task 0affcd87-79f5-5ff5-08b0-00000000006b 26264 1727204276.19894: variable 'ansible_search_path' from source: unknown 26264 1727204276.19898: variable 'ansible_search_path' from source: unknown 26264 1727204276.19940: calling self._execute() 26264 1727204276.20417: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204276.20428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204276.20441: variable 'omit' from source: magic vars 26264 1727204276.21533: variable 'ansible_distribution_major_version' from source: facts 26264 1727204276.21549: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204276.21768: variable 'network_provider' from source: set_fact 26264 1727204276.21788: Evaluated conditional (network_provider == "initscripts"): False 26264 1727204276.21792: when evaluation is False, skipping this task 26264 1727204276.21795: _execute() done 26264 1727204276.21797: dumping result to json 26264 1727204276.21828: done dumping result, returning 26264 1727204276.21831: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-5ff5-08b0-00000000006b] 26264 1727204276.21833: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006b 26264 1727204276.22011: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006b 26264 1727204276.22015: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 26264 1727204276.22076: no more pending results, returning what we have 26264 1727204276.22081: results queue empty 26264 1727204276.22082: checking for 
any_errors_fatal 26264 1727204276.22086: done checking for any_errors_fatal 26264 1727204276.22087: checking for max_fail_percentage 26264 1727204276.22089: done checking for max_fail_percentage 26264 1727204276.22090: checking to see if all hosts have failed and the running result is not ok 26264 1727204276.22091: done checking to see if all hosts have failed 26264 1727204276.22092: getting the remaining hosts for this loop 26264 1727204276.22094: done getting the remaining hosts for this loop 26264 1727204276.22099: getting the next task for host managed-node3 26264 1727204276.22106: done getting next task for host managed-node3 26264 1727204276.22110: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 26264 1727204276.22113: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204276.22130: getting variables 26264 1727204276.22132: in VariableManager get_vars() 26264 1727204276.22196: Calling all_inventory to load vars for managed-node3 26264 1727204276.22200: Calling groups_inventory to load vars for managed-node3 26264 1727204276.22206: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204276.22227: Calling all_plugins_play to load vars for managed-node3 26264 1727204276.22236: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204276.22240: Calling groups_plugins_play to load vars for managed-node3 26264 1727204276.24731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204276.27669: done with get_vars() 26264 1727204276.27704: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.088) 0:00:40.127 ***** 26264 1727204276.27821: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 26264 1727204276.28438: worker is 1 (out of 1 available) 26264 1727204276.28460: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 26264 1727204276.28492: done queuing things up, now waiting for results queue to drain 26264 1727204276.28494: waiting for pending results... 
26264 1727204276.29028: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 26264 1727204276.29209: in run() - task 0affcd87-79f5-5ff5-08b0-00000000006c 26264 1727204276.29226: variable 'ansible_search_path' from source: unknown 26264 1727204276.29230: variable 'ansible_search_path' from source: unknown 26264 1727204276.29284: calling self._execute() 26264 1727204276.29439: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204276.29443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204276.29457: variable 'omit' from source: magic vars 26264 1727204276.30005: variable 'ansible_distribution_major_version' from source: facts 26264 1727204276.30018: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204276.30029: variable 'omit' from source: magic vars 26264 1727204276.30078: variable 'omit' from source: magic vars 26264 1727204276.30357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26264 1727204276.33608: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26264 1727204276.33698: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26264 1727204276.33735: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26264 1727204276.33812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26264 1727204276.33857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26264 1727204276.33946: variable 'network_provider' from source: set_fact 26264 1727204276.34099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26264 1727204276.34157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26264 1727204276.34190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26264 1727204276.34239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26264 1727204276.34263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26264 1727204276.34357: variable 'omit' from source: magic vars 26264 1727204276.34499: variable 'omit' from source: magic vars 26264 1727204276.34619: variable 'network_connections' from source: play vars 26264 1727204276.34630: variable 'profile' from source: play vars 26264 1727204276.34741: variable 'profile' from source: play vars 26264 1727204276.34745: variable 'interface' from source: set_fact 26264 1727204276.34825: variable 'interface' from source: set_fact 26264 1727204276.35039: variable 'omit' from source: magic vars 26264 1727204276.35042: variable '__lsr_ansible_managed' from source: task vars 26264 1727204276.35118: variable '__lsr_ansible_managed' from source: task vars 26264 1727204276.35768: Loaded config def from plugin (lookup/template) 26264 1727204276.35772: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 26264 1727204276.35795: File lookup term: get_ansible_managed.j2 26264 
1727204276.35798: variable 'ansible_search_path' from source: unknown 26264 1727204276.35803: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 26264 1727204276.35814: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 26264 1727204276.35827: variable 'ansible_search_path' from source: unknown 26264 1727204276.45508: variable 'ansible_managed' from source: unknown 26264 1727204276.45659: variable 'omit' from source: magic vars 26264 1727204276.45687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204276.45711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204276.45724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204276.45739: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204276.45748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204276.45770: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204276.45773: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204276.45776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204276.45867: Set connection var ansible_pipelining to False 26264 1727204276.45870: Set connection var ansible_connection to ssh 26264 1727204276.45872: Set connection var ansible_shell_type to sh 26264 1727204276.45878: Set connection var ansible_shell_executable to /bin/sh 26264 1727204276.45886: Set connection var ansible_timeout to 10 26264 1727204276.45893: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204276.45916: variable 'ansible_shell_executable' from source: unknown 26264 1727204276.45919: variable 'ansible_connection' from source: unknown 26264 1727204276.45922: variable 'ansible_module_compression' from source: unknown 26264 1727204276.45924: variable 'ansible_shell_type' from source: unknown 26264 1727204276.45927: variable 'ansible_shell_executable' from source: unknown 26264 1727204276.45929: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204276.45934: variable 'ansible_pipelining' from source: unknown 26264 1727204276.45937: variable 'ansible_timeout' from source: unknown 26264 1727204276.45939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204276.46057: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204276.46069: variable 'omit' from source: magic vars 26264 1727204276.46072: starting attempt loop 26264 1727204276.46075: running the handler 26264 1727204276.46084: _low_level_execute_command(): starting 26264 1727204276.46089: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204276.46770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204276.46783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204276.46790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.46803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.46844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204276.46850: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204276.46863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.46880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204276.46887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204276.46894: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204276.46901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204276.46910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.46927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.46930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 <<< 26264 1727204276.46932: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204276.46942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.47043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204276.47046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204276.47048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204276.47635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204276.48743: stdout chunk (state=3): >>>/root <<< 26264 1727204276.48850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204276.48899: stderr chunk (state=3): >>><<< 26264 1727204276.48903: stdout chunk (state=3): >>><<< 26264 1727204276.48926: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204276.48935: _low_level_execute_command(): starting 26264 1727204276.48941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543 `" && echo ansible-tmp-1727204276.4892616-29429-1982975650543="` echo /root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543 `" ) && sleep 0' 26264 1727204276.49400: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.49406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.49434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204276.49441: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204276.49448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.49464: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204276.49472: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204276.49479: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204276.49484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204276.49492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.49501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.49506: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.49557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204276.49583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204276.49632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204276.51518: stdout chunk (state=3): >>>ansible-tmp-1727204276.4892616-29429-1982975650543=/root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543 <<< 26264 1727204276.51831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204276.51834: stdout chunk (state=3): >>><<< 26264 1727204276.51836: stderr chunk (state=3): >>><<< 26264 1727204276.51839: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204276.4892616-29429-1982975650543=/root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204276.51845: variable 'ansible_module_compression' from source: unknown 26264 1727204276.51847: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 26264 1727204276.51849: variable 'ansible_facts' from source: unknown 26264 1727204276.51939: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543/AnsiballZ_network_connections.py 26264 1727204276.52100: Sending initial data 26264 1727204276.52103: Sent initial data (166 bytes) 26264 1727204276.53715: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204276.53724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204276.53735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.53755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.53798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204276.53802: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204276.53816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.53829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204276.53836: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204276.53843: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204276.53854: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 26264 1727204276.53862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.53876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.53884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204276.53894: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204276.53899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.54030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204276.54102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204276.54114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204276.54182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204276.55846: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204276.55878: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204276.55915: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpsaj4cqik 
/root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543/AnsiballZ_network_connections.py <<< 26264 1727204276.55955: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204276.57317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204276.57432: stderr chunk (state=3): >>><<< 26264 1727204276.57435: stdout chunk (state=3): >>><<< 26264 1727204276.57456: done transferring module to remote 26264 1727204276.57482: _low_level_execute_command(): starting 26264 1727204276.57488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543/ /root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543/AnsiballZ_network_connections.py && sleep 0' 26264 1727204276.58232: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204276.58254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204276.58257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.58275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.58310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204276.58317: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204276.58332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.58345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204276.58354: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204276.58379: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204276.58382: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 26264 1727204276.58399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.58402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.58405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204276.58424: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204276.58427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.58506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204276.58523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204276.58532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204276.58605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204276.60293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204276.60358: stderr chunk (state=3): >>><<< 26264 1727204276.60361: stdout chunk (state=3): >>><<< 26264 1727204276.60377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204276.60385: _low_level_execute_command(): starting 26264 1727204276.60388: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543/AnsiballZ_network_connections.py && sleep 0' 26264 1727204276.60855: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204276.60862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.60897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.60902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204276.60913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.60918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204276.60923: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204276.60932: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.60995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204276.61001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204276.61009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204276.61080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204276.84456: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uls0c6kl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uls0c6kl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/28a85e21-2916-4a1f-bfbf-d47ba3d76f0d: error=unknown <<< 26264 1727204276.84606: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 26264 1727204276.86009: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204276.86102: stderr chunk (state=3): >>><<< 26264 1727204276.86106: stdout chunk (state=3): >>><<< 26264 1727204276.86263: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uls0c6kl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uls0c6kl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/28a85e21-2916-4a1f-bfbf-d47ba3d76f0d: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204276.86269: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204276.86272: _low_level_execute_command(): starting 26264 1727204276.86275: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204276.4892616-29429-1982975650543/ > 
/dev/null 2>&1 && sleep 0' 26264 1727204276.86944: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204276.86962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204276.86979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.86996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.87071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204276.87100: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204276.87121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.87155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204276.87172: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204276.87187: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204276.87204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204276.87227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204276.87259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204276.87283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204276.87302: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204276.87322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204276.87430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204276.87467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
26264 1727204276.87498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204276.87577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204276.89430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204276.89437: stdout chunk (state=3): >>><<< 26264 1727204276.89439: stderr chunk (state=3): >>><<< 26264 1727204276.89944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204276.89947: handler run complete 26264 1727204276.89950: attempt loop complete, returning result 26264 1727204276.89952: _execute() done 26264 1727204276.89953: dumping result to json 26264 1727204276.89955: done dumping result, returning 26264 1727204276.89957: done running TaskExecutor() for managed-node3/TASK: 
fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-5ff5-08b0-00000000006c] 26264 1727204276.89959: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006c 26264 1727204276.90029: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006c 26264 1727204276.90033: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 26264 1727204276.90122: no more pending results, returning what we have 26264 1727204276.90128: results queue empty 26264 1727204276.90129: checking for any_errors_fatal 26264 1727204276.90134: done checking for any_errors_fatal 26264 1727204276.90135: checking for max_fail_percentage 26264 1727204276.90136: done checking for max_fail_percentage 26264 1727204276.90137: checking to see if all hosts have failed and the running result is not ok 26264 1727204276.90138: done checking to see if all hosts have failed 26264 1727204276.90139: getting the remaining hosts for this loop 26264 1727204276.90140: done getting the remaining hosts for this loop 26264 1727204276.90144: getting the next task for host managed-node3 26264 1727204276.90149: done getting next task for host managed-node3 26264 1727204276.90152: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 26264 1727204276.90154: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204276.90165: getting variables 26264 1727204276.90167: in VariableManager get_vars() 26264 1727204276.90200: Calling all_inventory to load vars for managed-node3 26264 1727204276.90203: Calling groups_inventory to load vars for managed-node3 26264 1727204276.90206: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204276.90216: Calling all_plugins_play to load vars for managed-node3 26264 1727204276.90220: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204276.90222: Calling groups_plugins_play to load vars for managed-node3 26264 1727204276.91838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204276.93002: done with get_vars() 26264 1727204276.93034: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.653) 0:00:40.780 ***** 26264 1727204276.93128: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 26264 1727204276.93412: worker is 1 (out of 1 available) 26264 1727204276.93428: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 26264 1727204276.93443: done queuing things up, now waiting for results queue to drain 26264 1727204276.93445: waiting for pending results... 
26264 1727204276.93713: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 26264 1727204276.93815: in run() - task 0affcd87-79f5-5ff5-08b0-00000000006d 26264 1727204276.93827: variable 'ansible_search_path' from source: unknown 26264 1727204276.93830: variable 'ansible_search_path' from source: unknown 26264 1727204276.93858: calling self._execute() 26264 1727204276.93948: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204276.93952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204276.93955: variable 'omit' from source: magic vars 26264 1727204276.94334: variable 'ansible_distribution_major_version' from source: facts 26264 1727204276.94351: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204276.94444: variable 'network_state' from source: role '' defaults 26264 1727204276.94456: Evaluated conditional (network_state != {}): False 26264 1727204276.94459: when evaluation is False, skipping this task 26264 1727204276.94463: _execute() done 26264 1727204276.94467: dumping result to json 26264 1727204276.94469: done dumping result, returning 26264 1727204276.94474: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-5ff5-08b0-00000000006d] 26264 1727204276.94481: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006d 26264 1727204276.94575: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006d 26264 1727204276.94578: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26264 1727204276.94760: no more pending results, returning what we have 26264 1727204276.94765: results queue empty 26264 1727204276.94766: checking for any_errors_fatal 26264 1727204276.94805: done checking for any_errors_fatal 
26264 1727204276.94806: checking for max_fail_percentage 26264 1727204276.94808: done checking for max_fail_percentage 26264 1727204276.94808: checking to see if all hosts have failed and the running result is not ok 26264 1727204276.94809: done checking to see if all hosts have failed 26264 1727204276.94809: getting the remaining hosts for this loop 26264 1727204276.94810: done getting the remaining hosts for this loop 26264 1727204276.94813: getting the next task for host managed-node3 26264 1727204276.94817: done getting next task for host managed-node3 26264 1727204276.94820: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 26264 1727204276.94822: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204276.94846: getting variables 26264 1727204276.94850: in VariableManager get_vars() 26264 1727204276.94888: Calling all_inventory to load vars for managed-node3 26264 1727204276.94893: Calling groups_inventory to load vars for managed-node3 26264 1727204276.94897: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204276.94909: Calling all_plugins_play to load vars for managed-node3 26264 1727204276.94912: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204276.94916: Calling groups_plugins_play to load vars for managed-node3 26264 1727204276.96351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204276.98078: done with get_vars() 26264 1727204276.98106: done getting variables 26264 1727204276.98176: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.050) 0:00:40.831 ***** 26264 1727204276.98209: entering _queue_task() for managed-node3/debug 26264 1727204276.99034: worker is 1 (out of 1 available) 26264 1727204276.99048: exiting _queue_task() for managed-node3/debug 26264 1727204276.99060: done queuing things up, now waiting for results queue to drain 26264 1727204276.99062: waiting for pending results... 
26264 1727204276.99375: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
26264 1727204276.99470: in run() - task 0affcd87-79f5-5ff5-08b0-00000000006e
26264 1727204276.99482: variable 'ansible_search_path' from source: unknown
26264 1727204276.99486: variable 'ansible_search_path' from source: unknown
26264 1727204276.99513: calling self._execute()
26264 1727204276.99593: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204276.99597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204276.99606: variable 'omit' from source: magic vars
26264 1727204276.99900: variable 'ansible_distribution_major_version' from source: facts
26264 1727204276.99910: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204276.99916: variable 'omit' from source: magic vars
26264 1727204276.99942: variable 'omit' from source: magic vars
26264 1727204276.99973: variable 'omit' from source: magic vars
26264 1727204277.00004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
26264 1727204277.00031: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
26264 1727204277.00050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
26264 1727204277.00068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204277.00076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204277.00098: variable 'inventory_hostname' from source: host vars for 'managed-node3'
26264 1727204277.00101: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204277.00103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204277.00173: Set connection var ansible_pipelining to False
26264 1727204277.00176: Set connection var ansible_connection to ssh
26264 1727204277.00180: Set connection var ansible_shell_type to sh
26264 1727204277.00182: Set connection var ansible_shell_executable to /bin/sh
26264 1727204277.00190: Set connection var ansible_timeout to 10
26264 1727204277.00196: Set connection var ansible_module_compression to ZIP_DEFLATED
26264 1727204277.00215: variable 'ansible_shell_executable' from source: unknown
26264 1727204277.00217: variable 'ansible_connection' from source: unknown
26264 1727204277.00220: variable 'ansible_module_compression' from source: unknown
26264 1727204277.00222: variable 'ansible_shell_type' from source: unknown
26264 1727204277.00224: variable 'ansible_shell_executable' from source: unknown
26264 1727204277.00227: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204277.00230: variable 'ansible_pipelining' from source: unknown
26264 1727204277.00233: variable 'ansible_timeout' from source: unknown
26264 1727204277.00237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204277.00339: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
26264 1727204277.00352: variable 'omit' from source: magic vars
26264 1727204277.00355: starting attempt loop
26264 1727204277.00358: running the handler
26264 1727204277.00452: variable '__network_connections_result' from source: set_fact
26264 1727204277.00495: handler run complete
26264 1727204277.00506: attempt loop complete, returning result
26264 1727204277.00510: _execute() done
26264 1727204277.00513: dumping result to json
26264 1727204277.00515: done dumping result, returning
26264 1727204277.00521: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-5ff5-08b0-00000000006e]
26264 1727204277.00526: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006e
ok: [managed-node3] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
26264 1727204277.00667: no more pending results, returning what we have
26264 1727204277.00671: results queue empty
26264 1727204277.00672: checking for any_errors_fatal
26264 1727204277.00677: done checking for any_errors_fatal
26264 1727204277.00678: checking for max_fail_percentage
26264 1727204277.00679: done checking for max_fail_percentage
26264 1727204277.00680: checking to see if all hosts have failed and the running result is not ok
26264 1727204277.00681: done checking to see if all hosts have failed
26264 1727204277.00682: getting the remaining hosts for this loop
26264 1727204277.00684: done getting the remaining hosts for this loop
26264 1727204277.00688: getting the next task for host managed-node3
26264 1727204277.00693: done getting next task for host managed-node3
26264 1727204277.00698: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
26264 1727204277.00700: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204277.00710: getting variables
26264 1727204277.00712: in VariableManager get_vars()
26264 1727204277.00742: Calling all_inventory to load vars for managed-node3
26264 1727204277.00745: Calling groups_inventory to load vars for managed-node3
26264 1727204277.00747: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204277.00758: Calling all_plugins_play to load vars for managed-node3
26264 1727204277.00761: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204277.00771: Calling groups_plugins_play to load vars for managed-node3
26264 1727204277.01327: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006e
26264 1727204277.01331: WORKER PROCESS EXITING
26264 1727204277.06815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204277.08473: done with get_vars()
26264 1727204277.08502: done getting variables
26264 1727204277.08553: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.103) 0:00:40.934 *****

26264 1727204277.08581: entering _queue_task() for managed-node3/debug
26264 1727204277.08911: worker is 1 (out of 1 available)
26264 1727204277.08924: exiting _queue_task() for managed-node3/debug
26264 1727204277.08936: done queuing things up, now waiting for results queue to drain
26264 1727204277.08938: waiting for pending results...
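The repeated `Loading ... (found_in_cache=True)` records above come from Ansible's plugin loader serving each plugin class from an in-process cache after the first import. Below is a minimal sketch of that memoized-loader pattern; the class and method names are hypothetical, not Ansible's actual `PluginLoader` API:

```python
import importlib


class CachedPluginLoader:
    """Memoized loader sketch: import each plugin module once, then serve it from a dict."""

    def __init__(self):
        self._cache = {}

    def load(self, name, module_path):
        # First call imports the module; subsequent calls hit the cache,
        # analogous to the log's found_in_cache=True records.
        if name in self._cache:
            return self._cache[name], True
        module = importlib.import_module(module_path)
        self._cache[name] = module
        return module, False


loader = CachedPluginLoader()
_, cached_first = loader.load("json", "json")
_, cached_second = loader.load("json", "json")
# cached_first is False (fresh import), cached_second is True (served from cache)
```

The same loader instance lives for the whole run, which is why every record after the first load of `ssh.py` or `debug.py` reports `found_in_cache=True`.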
26264 1727204277.09220: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
26264 1727204277.09340: in run() - task 0affcd87-79f5-5ff5-08b0-00000000006f
26264 1727204277.09361: variable 'ansible_search_path' from source: unknown
26264 1727204277.09376: variable 'ansible_search_path' from source: unknown
26264 1727204277.09420: calling self._execute()
26264 1727204277.09541: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204277.09553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204277.09569: variable 'omit' from source: magic vars
26264 1727204277.09983: variable 'ansible_distribution_major_version' from source: facts
26264 1727204277.10001: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204277.10040: variable 'omit' from source: magic vars
26264 1727204277.10085: variable 'omit' from source: magic vars
26264 1727204277.10261: variable 'omit' from source: magic vars
26264 1727204277.10309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
26264 1727204277.10351: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
26264 1727204277.10491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
26264 1727204277.10514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204277.10532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204277.10568: variable 'inventory_hostname' from source: host vars for 'managed-node3'
26264 1727204277.10675: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204277.10686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204277.10871: Set connection var ansible_pipelining to False
26264 1727204277.10874: Set connection var ansible_connection to ssh
26264 1727204277.10879: Set connection var ansible_shell_type to sh
26264 1727204277.10887: Set connection var ansible_shell_executable to /bin/sh
26264 1727204277.10897: Set connection var ansible_timeout to 10
26264 1727204277.10910: Set connection var ansible_module_compression to ZIP_DEFLATED
26264 1727204277.10942: variable 'ansible_shell_executable' from source: unknown
26264 1727204277.10945: variable 'ansible_connection' from source: unknown
26264 1727204277.10948: variable 'ansible_module_compression' from source: unknown
26264 1727204277.10951: variable 'ansible_shell_type' from source: unknown
26264 1727204277.10954: variable 'ansible_shell_executable' from source: unknown
26264 1727204277.10956: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204277.10962: variable 'ansible_pipelining' from source: unknown
26264 1727204277.10965: variable 'ansible_timeout' from source: unknown
26264 1727204277.10971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204277.11137: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
26264 1727204277.11159: variable 'omit' from source: magic vars
26264 1727204277.11167: starting attempt loop
26264 1727204277.11171: running the handler
26264 1727204277.11219: variable '__network_connections_result' from source: set_fact
26264 1727204277.11322: variable '__network_connections_result' from source: set_fact
26264 1727204277.11443: handler run complete
26264 1727204277.11474: attempt loop complete, returning result
26264 1727204277.11478: _execute() done
26264 1727204277.11485: dumping result to json
26264 1727204277.11490: done dumping result, returning
26264 1727204277.11499: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-5ff5-08b0-00000000006f]
26264 1727204277.11503: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006f
26264 1727204277.11610: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000006f
26264 1727204277.11612: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "lsr27",
                        "persistent_state": "absent"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
26264 1727204277.11687: no more pending results, returning what we have
26264 1727204277.11691: results queue empty
26264 1727204277.11691: checking for any_errors_fatal
26264 1727204277.11700: done checking for any_errors_fatal
26264 1727204277.11701: checking for max_fail_percentage
26264 1727204277.11703: done checking for max_fail_percentage
26264 1727204277.11703: checking to see if all hosts have failed and the running result is not ok
26264 1727204277.11705: done checking to see if all hosts have failed
26264 1727204277.11705: getting the remaining hosts for this loop
26264 1727204277.11707: done getting the remaining hosts for this loop
26264 1727204277.11711: getting the next task for host managed-node3
26264 1727204277.11717: done getting next task for host managed-node3
26264 1727204277.11721: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
26264 1727204277.11723: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204277.11732: getting variables
26264 1727204277.11734: in VariableManager get_vars()
26264 1727204277.11773: Calling all_inventory to load vars for managed-node3
26264 1727204277.11777: Calling groups_inventory to load vars for managed-node3
26264 1727204277.11779: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204277.11789: Calling all_plugins_play to load vars for managed-node3
26264 1727204277.11791: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204277.11794: Calling groups_plugins_play to load vars for managed-node3
26264 1727204277.13172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204277.14824: done with get_vars()
26264 1727204277.14853: done getting variables
26264 1727204277.14914: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.063) 0:00:40.998 *****

26264 1727204277.14951: entering _queue_task() for managed-node3/debug
26264 1727204277.15440: worker is 1 (out of 1 available)
26264 1727204277.15452: exiting _queue_task() for managed-node3/debug
26264 1727204277.15468: done queuing things up, now waiting for results queue to drain
26264 1727204277.15470: waiting for pending results...
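The task queued above is then skipped: its `when:` condition `network_state != {}` evaluates to False against the role's default of `{}`, and the log emits a `skipping:` result carrying the first false condition. A simplified sketch of that evaluate-then-skip flow follows; real Ansible templates the conditions with Jinja2, and `eval()` here is only an illustrative stand-in:

```python
def run_task(task_name, conditions, variables):
    """Evaluate 'when' conditions in order; return a skipped result naming the first false one."""
    for cond in conditions:
        # Ansible renders these expressions with Jinja2 against the host's variables;
        # plain eval() is a stand-in to show the control flow only.
        if not eval(cond, {}, variables):
            return {"skipped": True, "false_condition": cond}
    return {"skipped": False, "changed": False}


result = run_task(
    "Show debug messages for the network_state",
    ["ansible_distribution_major_version != '6'", "network_state != {}"],
    {"ansible_distribution_major_version": "9", "network_state": {}},
)
# result == {"skipped": True, "false_condition": "network_state != {}"}
```

This mirrors the log: the distribution-version conditional evaluates True, the `network_state != {}` conditional evaluates False, and only the false condition is reported back in the `skipping:` result.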
26264 1727204277.17336: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
26264 1727204277.17560: in run() - task 0affcd87-79f5-5ff5-08b0-000000000070
26264 1727204277.17575: variable 'ansible_search_path' from source: unknown
26264 1727204277.17579: variable 'ansible_search_path' from source: unknown
26264 1727204277.17751: calling self._execute()
26264 1727204277.17954: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204277.17958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204277.17971: variable 'omit' from source: magic vars
26264 1727204277.18936: variable 'ansible_distribution_major_version' from source: facts
26264 1727204277.18941: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204277.19067: variable 'network_state' from source: role '' defaults
26264 1727204277.19079: Evaluated conditional (network_state != {}): False
26264 1727204277.19082: when evaluation is False, skipping this task
26264 1727204277.19085: _execute() done
26264 1727204277.19088: dumping result to json
26264 1727204277.19090: done dumping result, returning
26264 1727204277.19099: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-5ff5-08b0-000000000070]
26264 1727204277.19106: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000070
26264 1727204277.19321: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000070
26264 1727204277.19324: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "false_condition": "network_state != {}"
}
26264 1727204277.19379: no more pending results, returning what we have
26264 1727204277.19384: results queue empty
26264 1727204277.19385: checking for any_errors_fatal
26264 1727204277.19394: done checking for any_errors_fatal
26264 1727204277.19395: checking for max_fail_percentage
26264 1727204277.19397: done checking for max_fail_percentage
26264 1727204277.19398: checking to see if all hosts have failed and the running result is not ok
26264 1727204277.19399: done checking to see if all hosts have failed
26264 1727204277.19400: getting the remaining hosts for this loop
26264 1727204277.19401: done getting the remaining hosts for this loop
26264 1727204277.19406: getting the next task for host managed-node3
26264 1727204277.19412: done getting next task for host managed-node3
26264 1727204277.19417: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
26264 1727204277.19419: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
26264 1727204277.19435: getting variables
26264 1727204277.19437: in VariableManager get_vars()
26264 1727204277.19479: Calling all_inventory to load vars for managed-node3
26264 1727204277.19481: Calling groups_inventory to load vars for managed-node3
26264 1727204277.19484: Calling all_plugins_inventory to load vars for managed-node3
26264 1727204277.19496: Calling all_plugins_play to load vars for managed-node3
26264 1727204277.19498: Calling groups_plugins_inventory to load vars for managed-node3
26264 1727204277.19501: Calling groups_plugins_play to load vars for managed-node3
26264 1727204277.21895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
26264 1727204277.28099: done with get_vars()
26264 1727204277.28139: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.132) 0:00:41.131 *****

26264 1727204277.28248: entering _queue_task() for managed-node3/ping
26264 1727204277.28729: worker is 1 (out of 1 available)
26264 1727204277.28743: exiting _queue_task() for managed-node3/ping
26264 1727204277.28755: done queuing things up, now waiting for results queue to drain
26264 1727204277.28757: waiting for pending results...
26264 1727204277.29758: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity
26264 1727204277.30045: in run() - task 0affcd87-79f5-5ff5-08b0-000000000071
26264 1727204277.30070: variable 'ansible_search_path' from source: unknown
26264 1727204277.30079: variable 'ansible_search_path' from source: unknown
26264 1727204277.30133: calling self._execute()
26264 1727204277.30232: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204277.30243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204277.30258: variable 'omit' from source: magic vars
26264 1727204277.30967: variable 'ansible_distribution_major_version' from source: facts
26264 1727204277.30978: Evaluated conditional (ansible_distribution_major_version != '6'): True
26264 1727204277.30984: variable 'omit' from source: magic vars
26264 1727204277.31044: variable 'omit' from source: magic vars
26264 1727204277.31256: variable 'omit' from source: magic vars
26264 1727204277.31259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
26264 1727204277.31263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
26264 1727204277.31274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
26264 1727204277.31292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204277.31304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
26264 1727204277.31332: variable 'inventory_hostname' from source: host vars for 'managed-node3'
26264 1727204277.31336: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204277.31338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204277.31443: Set connection var ansible_pipelining to False
26264 1727204277.31447: Set connection var ansible_connection to ssh
26264 1727204277.31449: Set connection var ansible_shell_type to sh
26264 1727204277.31457: Set connection var ansible_shell_executable to /bin/sh
26264 1727204277.31467: Set connection var ansible_timeout to 10
26264 1727204277.31587: Set connection var ansible_module_compression to ZIP_DEFLATED
26264 1727204277.31612: variable 'ansible_shell_executable' from source: unknown
26264 1727204277.31615: variable 'ansible_connection' from source: unknown
26264 1727204277.31618: variable 'ansible_module_compression' from source: unknown
26264 1727204277.31621: variable 'ansible_shell_type' from source: unknown
26264 1727204277.31623: variable 'ansible_shell_executable' from source: unknown
26264 1727204277.31625: variable 'ansible_host' from source: host vars for 'managed-node3'
26264 1727204277.31629: variable 'ansible_pipelining' from source: unknown
26264 1727204277.31631: variable 'ansible_timeout' from source: unknown
26264 1727204277.31636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
26264 1727204277.32302: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
26264 1727204277.32461: variable 'omit' from source: magic vars
26264 1727204277.32468: starting attempt loop
26264 1727204277.32473: running the handler
26264 1727204277.32487: _low_level_execute_command(): starting
26264 1727204277.32495: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
26264 1727204277.34738: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204277.34747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204277.34886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204277.34892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204277.34911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<<
26264 1727204277.34918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204277.35117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204277.36658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204277.36759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204277.38367: stdout chunk (state=3): >>>/root <<<
26264 1727204277.38546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204277.38550: stdout chunk (state=3): >>><<<
26264 1727204277.38552: stderr chunk (state=3): >>><<<
26264 1727204277.38678: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
26264 1727204277.38683: _low_level_execute_command(): starting
26264 1727204277.38686: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461 `" && echo ansible-tmp-1727204277.3857918-29471-41052745130461="` echo /root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461 `" ) && sleep 0'
26264 1727204277.39651: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204277.39655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204277.39693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<<
26264 1727204277.39697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204277.39707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204277.39763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204277.39786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204277.39837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204277.41655: stdout chunk (state=3): >>>ansible-tmp-1727204277.3857918-29471-41052745130461=/root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461 <<<
26264 1727204277.41889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204277.41969: stderr chunk (state=3): >>><<<
26264 1727204277.41973: stdout chunk (state=3): >>><<<
26264 1727204277.42176: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204277.3857918-29471-41052745130461=/root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
26264 1727204277.42179: variable 'ansible_module_compression' from source: unknown
26264 1727204277.42182: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
26264 1727204277.42183: variable 'ansible_facts' from source: unknown
26264 1727204277.42233: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461/AnsiballZ_ping.py
26264 1727204277.42518: Sending initial data
26264 1727204277.42527: Sent initial data (152 bytes)
26264 1727204277.45221: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204277.45225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204277.45257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204277.45260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204277.45262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204277.45332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204277.45335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204277.45342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204277.45509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204277.47104: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
26264 1727204277.47129: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
26264 1727204277.47170: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpliheamj8 /root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461/AnsiballZ_ping.py <<<
26264 1727204277.47197: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
26264 1727204277.48457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204277.48539: stderr chunk (state=3): >>><<<
26264 1727204277.48543: stdout chunk (state=3): >>><<<
26264 1727204277.48563: done transferring module to remote
26264 1727204277.48575: _low_level_execute_command(): starting
26264 1727204277.48580: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461/ /root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461/AnsiballZ_ping.py && sleep 0'
26264 1727204277.49940: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
26264 1727204277.49951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204277.49959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204277.49980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204277.50136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204277.50139: stderr chunk (state=3): >>>debug2: match not found <<<
26264 1727204277.50142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204277.50144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
26264 1727204277.50146: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<<
26264 1727204277.50151: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
26264 1727204277.50154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
26264 1727204277.50156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
26264 1727204277.50158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
26264 1727204277.50160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<<
26264 1727204277.50162: stderr chunk (state=3): >>>debug2: match found <<<
26264 1727204277.50165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
26264 1727204277.50252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
26264 1727204277.50266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
26264 1727204277.50272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
26264 1727204277.50481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
26264 1727204277.52175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
26264 1727204277.52179: stdout chunk (state=3): >>><<<
26264 1727204277.52185: stderr chunk (state=3): >>><<<
26264 1727204277.52204: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204277.52208: _low_level_execute_command(): starting 26264 1727204277.52210: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461/AnsiballZ_ping.py && sleep 0' 26264 1727204277.53325: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204277.54080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204277.54084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204277.54086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204277.54125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204277.54131: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204277.54141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204277.54155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204277.54165: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204277.54172: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204277.54191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204277.54201: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204277.54213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204277.54220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204277.54227: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204277.54236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204277.54342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204277.54358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204277.54367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204277.54780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204277.67284: stdout chunk (state=3): >>> <<< 26264 1727204277.67289: stdout chunk (state=3): >>>{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 26264 1727204277.68322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204277.68326: stdout chunk (state=3): >>><<< 26264 1727204277.68331: stderr chunk (state=3): >>><<< 26264 1727204277.68358: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204277.68385: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204277.68395: _low_level_execute_command(): starting 26264 1727204277.68400: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204277.3857918-29471-41052745130461/ > /dev/null 2>&1 && sleep 0' 26264 1727204277.69672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204277.69693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204277.69709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204277.69729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204277.69785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204277.69798: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204277.69812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204277.69829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204277.69840: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 
1727204277.69852: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204277.69866: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204277.69884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204277.69899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204277.69910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204277.69921: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204277.69934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204277.70015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204277.70032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204277.70047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204277.70129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204277.71906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204277.71993: stderr chunk (state=3): >>><<< 26264 1727204277.71996: stdout chunk (state=3): >>><<< 26264 1727204277.72409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204277.72413: handler run complete 26264 1727204277.72415: attempt loop complete, returning result 26264 1727204277.72417: _execute() done 26264 1727204277.72422: dumping result to json 26264 1727204277.72424: done dumping result, returning 26264 1727204277.72432: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-5ff5-08b0-000000000071] 26264 1727204277.72434: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000071 26264 1727204277.72508: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000071 26264 1727204277.72512: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 26264 1727204277.72589: no more pending results, returning what we have 26264 1727204277.72593: results queue empty 26264 1727204277.72594: checking for any_errors_fatal 26264 1727204277.72600: done checking for any_errors_fatal 26264 1727204277.72601: checking for max_fail_percentage 26264 1727204277.72602: done checking for max_fail_percentage 26264 1727204277.72603: checking to see if all hosts have failed and the running result is not ok 26264 1727204277.72604: done checking to see if all hosts have failed 26264 1727204277.72605: getting the remaining hosts for this loop 26264 1727204277.72606: done getting the remaining hosts for this loop 26264 1727204277.72610: getting 
the next task for host managed-node3 26264 1727204277.72616: done getting next task for host managed-node3 26264 1727204277.72618: ^ task is: TASK: meta (role_complete) 26264 1727204277.72620: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204277.72631: getting variables 26264 1727204277.72632: in VariableManager get_vars() 26264 1727204277.72680: Calling all_inventory to load vars for managed-node3 26264 1727204277.72684: Calling groups_inventory to load vars for managed-node3 26264 1727204277.72686: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204277.72696: Calling all_plugins_play to load vars for managed-node3 26264 1727204277.72699: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204277.72702: Calling groups_plugins_play to load vars for managed-node3 26264 1727204277.74253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204277.76216: done with get_vars() 26264 1727204277.76240: done getting variables 26264 1727204277.76332: done queuing things up, now waiting for results queue to drain 26264 1727204277.76335: results queue empty 26264 1727204277.76336: checking for any_errors_fatal 26264 1727204277.76338: done checking for any_errors_fatal 26264 1727204277.76339: checking for max_fail_percentage 26264 1727204277.76340: done checking for max_fail_percentage 26264 1727204277.76341: checking to see if all hosts have failed and the running result is not ok 26264 1727204277.76342: done checking to see if all hosts have failed 26264 1727204277.76343: getting the remaining hosts for this loop 26264 1727204277.76344: done getting the remaining hosts for this loop 26264 1727204277.76347: 
getting the next task for host managed-node3 26264 1727204277.76357: done getting next task for host managed-node3 26264 1727204277.76359: ^ task is: TASK: meta (flush_handlers) 26264 1727204277.76361: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204277.76366: getting variables 26264 1727204277.76367: in VariableManager get_vars() 26264 1727204277.76380: Calling all_inventory to load vars for managed-node3 26264 1727204277.76383: Calling groups_inventory to load vars for managed-node3 26264 1727204277.76385: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204277.76390: Calling all_plugins_play to load vars for managed-node3 26264 1727204277.76393: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204277.76396: Calling groups_plugins_play to load vars for managed-node3 26264 1727204277.77686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204277.79487: done with get_vars() 26264 1727204277.79522: done getting variables 26264 1727204277.79580: in VariableManager get_vars() 26264 1727204277.79595: Calling all_inventory to load vars for managed-node3 26264 1727204277.79598: Calling groups_inventory to load vars for managed-node3 26264 1727204277.79605: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204277.79614: Calling all_plugins_play to load vars for managed-node3 26264 1727204277.79617: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204277.79621: Calling groups_plugins_play to load vars for managed-node3 26264 1727204277.81090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 
1727204277.82872: done with get_vars() 26264 1727204277.82905: done queuing things up, now waiting for results queue to drain 26264 1727204277.82908: results queue empty 26264 1727204277.82909: checking for any_errors_fatal 26264 1727204277.82910: done checking for any_errors_fatal 26264 1727204277.82916: checking for max_fail_percentage 26264 1727204277.82917: done checking for max_fail_percentage 26264 1727204277.82918: checking to see if all hosts have failed and the running result is not ok 26264 1727204277.82919: done checking to see if all hosts have failed 26264 1727204277.82919: getting the remaining hosts for this loop 26264 1727204277.82920: done getting the remaining hosts for this loop 26264 1727204277.82923: getting the next task for host managed-node3 26264 1727204277.82927: done getting next task for host managed-node3 26264 1727204277.82929: ^ task is: TASK: meta (flush_handlers) 26264 1727204277.82931: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204277.82933: getting variables 26264 1727204277.82934: in VariableManager get_vars() 26264 1727204277.82947: Calling all_inventory to load vars for managed-node3 26264 1727204277.82949: Calling groups_inventory to load vars for managed-node3 26264 1727204277.82951: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204277.82961: Calling all_plugins_play to load vars for managed-node3 26264 1727204277.82963: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204277.82970: Calling groups_plugins_play to load vars for managed-node3 26264 1727204277.84250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204277.86010: done with get_vars() 26264 1727204277.86033: done getting variables 26264 1727204277.86095: in VariableManager get_vars() 26264 1727204277.86113: Calling all_inventory to load vars for managed-node3 26264 1727204277.86116: Calling groups_inventory to load vars for managed-node3 26264 1727204277.86118: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204277.86123: Calling all_plugins_play to load vars for managed-node3 26264 1727204277.86125: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204277.86127: Calling groups_plugins_play to load vars for managed-node3 26264 1727204277.87402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204277.88358: done with get_vars() 26264 1727204277.88384: done queuing things up, now waiting for results queue to drain 26264 1727204277.88386: results queue empty 26264 1727204277.88386: checking for any_errors_fatal 26264 1727204277.88387: done checking for any_errors_fatal 26264 1727204277.88388: checking for max_fail_percentage 26264 1727204277.88388: done checking for max_fail_percentage 26264 1727204277.88389: checking to see if all hosts have failed and the running result is not 
ok 26264 1727204277.88389: done checking to see if all hosts have failed 26264 1727204277.88390: getting the remaining hosts for this loop 26264 1727204277.88390: done getting the remaining hosts for this loop 26264 1727204277.88392: getting the next task for host managed-node3 26264 1727204277.88395: done getting next task for host managed-node3 26264 1727204277.88396: ^ task is: None 26264 1727204277.88397: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204277.88398: done queuing things up, now waiting for results queue to drain 26264 1727204277.88398: results queue empty 26264 1727204277.88399: checking for any_errors_fatal 26264 1727204277.88400: done checking for any_errors_fatal 26264 1727204277.88400: checking for max_fail_percentage 26264 1727204277.88401: done checking for max_fail_percentage 26264 1727204277.88402: checking to see if all hosts have failed and the running result is not ok 26264 1727204277.88402: done checking to see if all hosts have failed 26264 1727204277.88403: getting the next task for host managed-node3 26264 1727204277.88404: done getting next task for host managed-node3 26264 1727204277.88405: ^ task is: None 26264 1727204277.88406: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204277.88441: in VariableManager get_vars() 26264 1727204277.88453: done with get_vars() 26264 1727204277.88458: in VariableManager get_vars() 26264 1727204277.88466: done with get_vars() 26264 1727204277.88469: variable 'omit' from source: magic vars 26264 1727204277.88493: in VariableManager get_vars() 26264 1727204277.88499: done with get_vars() 26264 1727204277.88514: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 26264 1727204277.88637: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 26264 1727204277.88660: getting the remaining hosts for this loop 26264 1727204277.88662: done getting the remaining hosts for this loop 26264 1727204277.88665: getting the next task for host managed-node3 26264 1727204277.88668: done getting next task for host managed-node3 26264 1727204277.88669: ^ task is: TASK: Gathering Facts 26264 1727204277.88670: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204277.88671: getting variables 26264 1727204277.88672: in VariableManager get_vars() 26264 1727204277.88678: Calling all_inventory to load vars for managed-node3 26264 1727204277.88680: Calling groups_inventory to load vars for managed-node3 26264 1727204277.88681: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204277.88685: Calling all_plugins_play to load vars for managed-node3 26264 1727204277.88687: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204277.88688: Calling groups_plugins_play to load vars for managed-node3 26264 1727204277.89403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204277.91153: done with get_vars() 26264 1727204277.91176: done getting variables 26264 1727204277.91232: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.630) 0:00:41.761 ***** 26264 1727204277.91266: entering _queue_task() for managed-node3/gather_facts 26264 1727204277.91655: worker is 1 (out of 1 available) 26264 1727204277.91667: exiting _queue_task() for managed-node3/gather_facts 26264 1727204277.91683: done queuing things up, now waiting for results queue to drain 26264 1727204277.91685: waiting for pending results... 
26264 1727204277.91973: running TaskExecutor() for managed-node3/TASK: Gathering Facts 26264 1727204277.92204: in run() - task 0affcd87-79f5-5ff5-08b0-0000000004e4 26264 1727204277.92232: variable 'ansible_search_path' from source: unknown 26264 1727204277.92302: calling self._execute() 26264 1727204277.92408: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204277.92411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204277.92423: variable 'omit' from source: magic vars 26264 1727204277.92718: variable 'ansible_distribution_major_version' from source: facts 26264 1727204277.92728: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204277.92734: variable 'omit' from source: magic vars 26264 1727204277.92762: variable 'omit' from source: magic vars 26264 1727204277.92791: variable 'omit' from source: magic vars 26264 1727204277.92822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204277.92851: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204277.92878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204277.92896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204277.92904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204277.92928: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204277.92931: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204277.92934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204277.93005: Set connection var ansible_pipelining to False 26264 1727204277.93009: Set 
connection var ansible_connection to ssh 26264 1727204277.93011: Set connection var ansible_shell_type to sh 26264 1727204277.93016: Set connection var ansible_shell_executable to /bin/sh 26264 1727204277.93023: Set connection var ansible_timeout to 10 26264 1727204277.93029: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204277.93048: variable 'ansible_shell_executable' from source: unknown 26264 1727204277.93050: variable 'ansible_connection' from source: unknown 26264 1727204277.93055: variable 'ansible_module_compression' from source: unknown 26264 1727204277.93058: variable 'ansible_shell_type' from source: unknown 26264 1727204277.93060: variable 'ansible_shell_executable' from source: unknown 26264 1727204277.93062: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204277.93068: variable 'ansible_pipelining' from source: unknown 26264 1727204277.93071: variable 'ansible_timeout' from source: unknown 26264 1727204277.93076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204277.93211: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204277.93225: variable 'omit' from source: magic vars 26264 1727204277.93231: starting attempt loop 26264 1727204277.93234: running the handler 26264 1727204277.93247: variable 'ansible_facts' from source: unknown 26264 1727204277.93263: _low_level_execute_command(): starting 26264 1727204277.93272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204277.93800: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
26264 1727204277.93817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204277.93829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204277.93846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204277.93870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204277.93899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204277.93911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204277.93970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204277.95551: stdout chunk (state=3): >>>/root <<< 26264 1727204277.95683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204277.95709: stderr chunk (state=3): >>><<< 26264 1727204277.95711: stdout chunk (state=3): >>><<< 26264 1727204277.95776: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204277.95780: _low_level_execute_command(): starting 26264 1727204277.95782: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105 `" && echo ansible-tmp-1727204277.9572947-29508-16624843265105="` echo /root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105 `" ) && sleep 0' 26264 1727204277.96222: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204277.96225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204277.96269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204277.96280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204277.96282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204277.96318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204277.96330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204277.96394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204277.98185: stdout chunk (state=3): >>>ansible-tmp-1727204277.9572947-29508-16624843265105=/root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105 <<< 26264 1727204277.98307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204277.98385: stderr chunk (state=3): >>><<< 26264 1727204277.98388: stdout chunk (state=3): >>><<< 26264 1727204277.98413: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204277.9572947-29508-16624843265105=/root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204277.98462: variable 'ansible_module_compression' from source: unknown 26264 1727204277.98540: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204277.98625: variable 'ansible_facts' from source: unknown 26264 1727204277.99130: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105/AnsiballZ_setup.py 26264 1727204277.99359: Sending initial data 26264 1727204277.99403: Sent initial data (153 bytes) 26264 1727204278.00329: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204278.00354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204278.00393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204278.00411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204278.00467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204278.00502: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204278.00708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204278.00915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204278.01019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204278.01042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204278.02668: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204278.02720: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204278.02758: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmplflmb82w /root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105/AnsiballZ_setup.py <<< 26264 1727204278.02801: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204278.05361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204278.05474: stderr chunk (state=3): >>><<< 
26264 1727204278.05478: stdout chunk (state=3): >>><<< 26264 1727204278.05500: done transferring module to remote 26264 1727204278.05514: _low_level_execute_command(): starting 26264 1727204278.05517: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105/ /root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105/AnsiballZ_setup.py && sleep 0' 26264 1727204278.05986: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204278.05989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204278.05992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204278.06027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204278.06035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204278.06038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204278.06087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204278.06100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204278.06109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204278.06182: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204278.07867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204278.08002: stderr chunk (state=3): >>><<< 26264 1727204278.08015: stdout chunk (state=3): >>><<< 26264 1727204278.08079: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204278.08082: _low_level_execute_command(): starting 26264 1727204278.08084: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105/AnsiballZ_setup.py && sleep 0' 26264 1727204278.09021: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204278.09052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204278.09097: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204278.09129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204278.09228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204278.09240: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204278.09257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204278.09281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204278.09296: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204278.09320: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204278.09346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204278.09371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204278.09405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204278.09435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204278.09457: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204278.09485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204278.09597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204278.09617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204278.09635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204278.09734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204278.59217: stdout chunk 
(state=3): >>> {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2806, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 726, "free": 2806}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 624, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264280043520, "block_size": 4096, "block_total": 65519355, "block_available": 64521495, "block_used": 997860, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": 
"root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansi<<< 26264 1727204278.59232: stdout chunk (state=3): >>>ble_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": 
"on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": 
"0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vml<<< 26264 1727204278.59256: stdout chunk (state=3): >>>inuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "58", "epoch": "1727204278", "epoch_int": "1727204278", "date": "2024-09-24", "time": "14:57:58", "iso8601_micro": "2024-09-24T18:57:58.588045Z", "iso8601": "2024-09-24T18:57:58Z", "iso8601_basic": "20240924T145758588045", "iso8601_basic_short": "20240924T145758", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.43, "5m": 0.38, "15m": 0.2}, "ansible_local": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204278.60917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204278.60922: stdout chunk (state=3): >>><<< 26264 1727204278.60924: stderr chunk (state=3): >>><<< 26264 1727204278.61406: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2806, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 726, "free": 2806}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": 
"ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 624, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264280043520, "block_size": 4096, "block_total": 65519355, "block_available": 64521495, "block_used": 997860, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", 
"broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": 
"ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "58", "epoch": "1727204278", "epoch_int": "1727204278", "date": "2024-09-24", "time": "14:57:58", "iso8601_micro": "2024-09-24T18:57:58.588045Z", "iso8601": "2024-09-24T18:57:58Z", "iso8601_basic": "20240924T145758588045", "iso8601_basic_short": "20240924T145758", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", 
"PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.43, "5m": 0.38, "15m": 0.2}, "ansible_local": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204278.61417: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204278.61419: _low_level_execute_command(): starting 26264 1727204278.61421: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204277.9572947-29508-16624843265105/ > /dev/null 2>&1 && sleep 0' 26264 1727204278.62002: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204278.62015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204278.62031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204278.62054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204278.62097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204278.62107: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204278.62118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204278.62134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204278.62144: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204278.62155: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204278.62169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204278.62180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204278.62195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204278.62205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204278.62215: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204278.62229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204278.62306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204278.62325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204278.62339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204278.62416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204278.64280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204278.64284: stdout chunk (state=3): >>><<< 26264 1727204278.64286: stderr chunk (state=3): >>><<< 26264 1727204278.64571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204278.64575: handler run complete 26264 1727204278.64577: variable 'ansible_facts' from source: unknown 26264 1727204278.64579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204278.64901: variable 'ansible_facts' from source: unknown 26264 1727204278.64993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204278.65139: attempt loop complete, returning result 26264 1727204278.65147: _execute() done 26264 1727204278.65157: dumping result to json 26264 1727204278.65192: done dumping result, returning 26264 1727204278.65203: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-5ff5-08b0-0000000004e4] 26264 1727204278.65212: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000004e4 ok: [managed-node3] 26264 1727204278.65840: no more pending results, returning what we have 26264 1727204278.65844: results queue empty 26264 1727204278.65845: checking for any_errors_fatal 26264 1727204278.65846: done checking for any_errors_fatal 26264 1727204278.65847: checking for max_fail_percentage 26264 1727204278.65851: done checking for max_fail_percentage 26264 1727204278.65852: checking to see if all hosts have failed and the 
running result is not ok 26264 1727204278.65853: done checking to see if all hosts have failed 26264 1727204278.65854: getting the remaining hosts for this loop 26264 1727204278.65856: done getting the remaining hosts for this loop 26264 1727204278.65860: getting the next task for host managed-node3 26264 1727204278.65868: done getting next task for host managed-node3 26264 1727204278.65869: ^ task is: TASK: meta (flush_handlers) 26264 1727204278.65872: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204278.65876: getting variables 26264 1727204278.65878: in VariableManager get_vars() 26264 1727204278.65904: Calling all_inventory to load vars for managed-node3 26264 1727204278.65908: Calling groups_inventory to load vars for managed-node3 26264 1727204278.65911: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204278.65924: Calling all_plugins_play to load vars for managed-node3 26264 1727204278.65926: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204278.65929: Calling groups_plugins_play to load vars for managed-node3 26264 1727204278.66910: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000004e4 26264 1727204278.66914: WORKER PROCESS EXITING 26264 1727204278.67715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204278.69418: done with get_vars() 26264 1727204278.69452: done getting variables 26264 1727204278.69530: in VariableManager get_vars() 26264 1727204278.69541: Calling all_inventory to load vars for managed-node3 26264 1727204278.69544: Calling groups_inventory to load vars for managed-node3 26264 1727204278.69546: Calling all_plugins_inventory to load vars for 
managed-node3 26264 1727204278.69554: Calling all_plugins_play to load vars for managed-node3 26264 1727204278.69557: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204278.69570: Calling groups_plugins_play to load vars for managed-node3 26264 1727204278.70940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204278.72627: done with get_vars() 26264 1727204278.72665: done queuing things up, now waiting for results queue to drain 26264 1727204278.72668: results queue empty 26264 1727204278.72669: checking for any_errors_fatal 26264 1727204278.72673: done checking for any_errors_fatal 26264 1727204278.72674: checking for max_fail_percentage 26264 1727204278.72675: done checking for max_fail_percentage 26264 1727204278.72676: checking to see if all hosts have failed and the running result is not ok 26264 1727204278.72677: done checking to see if all hosts have failed 26264 1727204278.72677: getting the remaining hosts for this loop 26264 1727204278.72678: done getting the remaining hosts for this loop 26264 1727204278.72681: getting the next task for host managed-node3 26264 1727204278.72685: done getting next task for host managed-node3 26264 1727204278.72688: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 26264 1727204278.72689: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204278.72691: getting variables 26264 1727204278.72692: in VariableManager get_vars() 26264 1727204278.72702: Calling all_inventory to load vars for managed-node3 26264 1727204278.72704: Calling groups_inventory to load vars for managed-node3 26264 1727204278.72706: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204278.72712: Calling all_plugins_play to load vars for managed-node3 26264 1727204278.72714: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204278.72718: Calling groups_plugins_play to load vars for managed-node3 26264 1727204278.73970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204278.75701: done with get_vars() 26264 1727204278.75723: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Tuesday 24 September 2024 14:57:58 -0400 (0:00:00.845) 0:00:42.607 ***** 26264 1727204278.75804: entering _queue_task() for managed-node3/include_tasks 26264 1727204278.76204: worker is 1 (out of 1 available) 26264 1727204278.76216: exiting _queue_task() for managed-node3/include_tasks 26264 1727204278.76227: done queuing things up, now waiting for results queue to drain 26264 1727204278.76228: waiting for pending results... 
26264 1727204278.76509: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_absent.yml' 26264 1727204278.76631: in run() - task 0affcd87-79f5-5ff5-08b0-000000000074 26264 1727204278.76651: variable 'ansible_search_path' from source: unknown 26264 1727204278.76696: calling self._execute() 26264 1727204278.76797: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204278.76813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204278.76832: variable 'omit' from source: magic vars 26264 1727204278.77237: variable 'ansible_distribution_major_version' from source: facts 26264 1727204278.77258: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204278.77271: _execute() done 26264 1727204278.77279: dumping result to json 26264 1727204278.77285: done dumping result, returning 26264 1727204278.77296: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_absent.yml' [0affcd87-79f5-5ff5-08b0-000000000074] 26264 1727204278.77305: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000074 26264 1727204278.77419: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000074 26264 1727204278.77428: WORKER PROCESS EXITING 26264 1727204278.77459: no more pending results, returning what we have 26264 1727204278.77466: in VariableManager get_vars() 26264 1727204278.77501: Calling all_inventory to load vars for managed-node3 26264 1727204278.77505: Calling groups_inventory to load vars for managed-node3 26264 1727204278.77509: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204278.77524: Calling all_plugins_play to load vars for managed-node3 26264 1727204278.77528: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204278.77532: Calling groups_plugins_play to load vars for managed-node3 26264 1727204278.79269: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204278.80969: done with get_vars() 26264 1727204278.80991: variable 'ansible_search_path' from source: unknown 26264 1727204278.81007: we have included files to process 26264 1727204278.81008: generating all_blocks data 26264 1727204278.81009: done generating all_blocks data 26264 1727204278.81010: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 26264 1727204278.81011: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 26264 1727204278.81014: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 26264 1727204278.81185: in VariableManager get_vars() 26264 1727204278.81203: done with get_vars() 26264 1727204278.81322: done processing included file 26264 1727204278.81324: iterating over new_blocks loaded from include file 26264 1727204278.81326: in VariableManager get_vars() 26264 1727204278.81338: done with get_vars() 26264 1727204278.81340: filtering new block on tags 26264 1727204278.81360: done filtering new block on tags 26264 1727204278.81363: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node3 26264 1727204278.81370: extending task lists for all hosts with included blocks 26264 1727204278.81420: done extending task lists 26264 1727204278.81421: done processing included files 26264 1727204278.81422: results queue empty 26264 1727204278.81423: checking for any_errors_fatal 26264 1727204278.81425: done checking for any_errors_fatal 26264 1727204278.81426: checking for max_fail_percentage 26264 1727204278.81427: done 
checking for max_fail_percentage 26264 1727204278.81427: checking to see if all hosts have failed and the running result is not ok 26264 1727204278.81428: done checking to see if all hosts have failed 26264 1727204278.81429: getting the remaining hosts for this loop 26264 1727204278.81430: done getting the remaining hosts for this loop 26264 1727204278.81433: getting the next task for host managed-node3 26264 1727204278.81437: done getting next task for host managed-node3 26264 1727204278.81439: ^ task is: TASK: Include the task 'get_profile_stat.yml' 26264 1727204278.81441: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204278.81444: getting variables 26264 1727204278.81445: in VariableManager get_vars() 26264 1727204278.81457: Calling all_inventory to load vars for managed-node3 26264 1727204278.81459: Calling groups_inventory to load vars for managed-node3 26264 1727204278.81461: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204278.81468: Calling all_plugins_play to load vars for managed-node3 26264 1727204278.81471: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204278.81474: Calling groups_plugins_play to load vars for managed-node3 26264 1727204278.82821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204278.84484: done with get_vars() 26264 1727204278.84508: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:57:58 -0400 (0:00:00.087) 0:00:42.695 ***** 26264 1727204278.84592: entering _queue_task() for managed-node3/include_tasks 26264 1727204278.84939: worker is 1 (out of 1 available) 26264 1727204278.84954: exiting _queue_task() for managed-node3/include_tasks 26264 1727204278.84972: done queuing things up, now waiting for results queue to drain 26264 1727204278.84974: waiting for pending results... 
26264 1727204278.85260: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 26264 1727204278.85391: in run() - task 0affcd87-79f5-5ff5-08b0-0000000004f5 26264 1727204278.85411: variable 'ansible_search_path' from source: unknown 26264 1727204278.85422: variable 'ansible_search_path' from source: unknown 26264 1727204278.85467: calling self._execute() 26264 1727204278.85577: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204278.85589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204278.85605: variable 'omit' from source: magic vars 26264 1727204278.86033: variable 'ansible_distribution_major_version' from source: facts 26264 1727204278.86054: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204278.86071: _execute() done 26264 1727204278.86080: dumping result to json 26264 1727204278.86088: done dumping result, returning 26264 1727204278.86098: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-5ff5-08b0-0000000004f5] 26264 1727204278.86111: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000004f5 26264 1727204278.86246: no more pending results, returning what we have 26264 1727204278.86254: in VariableManager get_vars() 26264 1727204278.86294: Calling all_inventory to load vars for managed-node3 26264 1727204278.86297: Calling groups_inventory to load vars for managed-node3 26264 1727204278.86301: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204278.86317: Calling all_plugins_play to load vars for managed-node3 26264 1727204278.86321: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204278.86324: Calling groups_plugins_play to load vars for managed-node3 26264 1727204278.87404: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000004f5 26264 1727204278.87408: WORKER PROCESS EXITING 26264 
1727204278.88125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204278.89896: done with get_vars() 26264 1727204278.89924: variable 'ansible_search_path' from source: unknown 26264 1727204278.89925: variable 'ansible_search_path' from source: unknown 26264 1727204278.89967: we have included files to process 26264 1727204278.89969: generating all_blocks data 26264 1727204278.89970: done generating all_blocks data 26264 1727204278.89972: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 26264 1727204278.89973: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 26264 1727204278.89975: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 26264 1727204278.91098: done processing included file 26264 1727204278.91101: iterating over new_blocks loaded from include file 26264 1727204278.91102: in VariableManager get_vars() 26264 1727204278.91118: done with get_vars() 26264 1727204278.91119: filtering new block on tags 26264 1727204278.91140: done filtering new block on tags 26264 1727204278.91142: in VariableManager get_vars() 26264 1727204278.91154: done with get_vars() 26264 1727204278.91155: filtering new block on tags 26264 1727204278.91175: done filtering new block on tags 26264 1727204278.91177: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 26264 1727204278.91182: extending task lists for all hosts with included blocks 26264 1727204278.91287: done extending task lists 26264 1727204278.91289: done processing included files 26264 1727204278.91289: results queue empty 26264 
1727204278.91290: checking for any_errors_fatal 26264 1727204278.91294: done checking for any_errors_fatal 26264 1727204278.91295: checking for max_fail_percentage 26264 1727204278.91296: done checking for max_fail_percentage 26264 1727204278.91296: checking to see if all hosts have failed and the running result is not ok 26264 1727204278.91297: done checking to see if all hosts have failed 26264 1727204278.91298: getting the remaining hosts for this loop 26264 1727204278.91299: done getting the remaining hosts for this loop 26264 1727204278.91301: getting the next task for host managed-node3 26264 1727204278.91306: done getting next task for host managed-node3 26264 1727204278.91308: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 26264 1727204278.91311: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204278.91313: getting variables 26264 1727204278.91314: in VariableManager get_vars() 26264 1727204278.91392: Calling all_inventory to load vars for managed-node3 26264 1727204278.91395: Calling groups_inventory to load vars for managed-node3 26264 1727204278.91398: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204278.91403: Calling all_plugins_play to load vars for managed-node3 26264 1727204278.91406: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204278.91409: Calling groups_plugins_play to load vars for managed-node3 26264 1727204278.92754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204278.94510: done with get_vars() 26264 1727204278.94544: done getting variables 26264 1727204278.94605: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:57:58 -0400 (0:00:00.100) 0:00:42.795 ***** 26264 1727204278.94639: entering _queue_task() for managed-node3/set_fact 26264 1727204278.95030: worker is 1 (out of 1 available) 26264 1727204278.95044: exiting _queue_task() for managed-node3/set_fact 26264 1727204278.95060: done queuing things up, now waiting for results queue to drain 26264 1727204278.95062: waiting for pending results... 
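The task being queued here is a plain `set_fact` that seeds three per-host tracking flags; its result appears a little further down in this trace. As an illustrative reconstruction only (the function name `init_profile_flags` is hypothetical; the flag names are taken verbatim from the task result recorded in this log):

```python
# Hypothetical sketch of what the "Initialize NM profile exist and
# ansible_managed comment flag" set_fact task establishes for a host.
# Flag names are copied from the task result shown later in this trace;
# this is NOT Ansible's implementation, just the equivalent logic.
def init_profile_flags():
    """Return the baseline facts before the profile file is inspected."""
    return {
        "lsr_net_profile_exists": False,
        "lsr_net_profile_ansible_managed": False,
        "lsr_net_profile_fingerprint": False,
    }

facts = init_profile_flags()
# Every flag starts false; later tasks flip them based on what stat finds.
assert all(value is False for value in facts.values())
```

Because `set_fact` runs entirely on the controller, the handler completes immediately in the trace (no `_low_level_execute_command()` calls), which is why this task finishes in 0.066s with no SSH traffic.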
26264 1727204278.95387: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 26264 1727204278.95518: in run() - task 0affcd87-79f5-5ff5-08b0-000000000502 26264 1727204278.95538: variable 'ansible_search_path' from source: unknown 26264 1727204278.95546: variable 'ansible_search_path' from source: unknown 26264 1727204278.95598: calling self._execute() 26264 1727204278.95706: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204278.95721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204278.95738: variable 'omit' from source: magic vars 26264 1727204278.96180: variable 'ansible_distribution_major_version' from source: facts 26264 1727204278.96199: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204278.96216: variable 'omit' from source: magic vars 26264 1727204278.96275: variable 'omit' from source: magic vars 26264 1727204278.96321: variable 'omit' from source: magic vars 26264 1727204278.96371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204278.96420: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204278.96452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204278.96480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204278.96502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204278.96541: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204278.96554: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204278.96566: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 26264 1727204278.96684: Set connection var ansible_pipelining to False 26264 1727204278.96693: Set connection var ansible_connection to ssh 26264 1727204278.96701: Set connection var ansible_shell_type to sh 26264 1727204278.96715: Set connection var ansible_shell_executable to /bin/sh 26264 1727204278.96728: Set connection var ansible_timeout to 10 26264 1727204278.96740: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204278.96777: variable 'ansible_shell_executable' from source: unknown 26264 1727204278.96787: variable 'ansible_connection' from source: unknown 26264 1727204278.96794: variable 'ansible_module_compression' from source: unknown 26264 1727204278.96801: variable 'ansible_shell_type' from source: unknown 26264 1727204278.96808: variable 'ansible_shell_executable' from source: unknown 26264 1727204278.96819: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204278.96828: variable 'ansible_pipelining' from source: unknown 26264 1727204278.96835: variable 'ansible_timeout' from source: unknown 26264 1727204278.96842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204278.97005: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204278.97021: variable 'omit' from source: magic vars 26264 1727204278.97038: starting attempt loop 26264 1727204278.97045: running the handler 26264 1727204278.97068: handler run complete 26264 1727204278.97088: attempt loop complete, returning result 26264 1727204278.97096: _execute() done 26264 1727204278.97102: dumping result to json 26264 1727204278.97109: done dumping result, returning 26264 1727204278.97120: done running TaskExecutor() for 
managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-5ff5-08b0-000000000502] 26264 1727204278.97130: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000502 26264 1727204278.97245: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000502 ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 26264 1727204278.97308: no more pending results, returning what we have 26264 1727204278.97312: results queue empty 26264 1727204278.97313: checking for any_errors_fatal 26264 1727204278.97315: done checking for any_errors_fatal 26264 1727204278.97316: checking for max_fail_percentage 26264 1727204278.97317: done checking for max_fail_percentage 26264 1727204278.97318: checking to see if all hosts have failed and the running result is not ok 26264 1727204278.97320: done checking to see if all hosts have failed 26264 1727204278.97321: getting the remaining hosts for this loop 26264 1727204278.97322: done getting the remaining hosts for this loop 26264 1727204278.97326: getting the next task for host managed-node3 26264 1727204278.97334: done getting next task for host managed-node3 26264 1727204278.97338: ^ task is: TASK: Stat profile file 26264 1727204278.97344: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204278.97352: getting variables 26264 1727204278.97354: in VariableManager get_vars() 26264 1727204278.97390: Calling all_inventory to load vars for managed-node3 26264 1727204278.97394: Calling groups_inventory to load vars for managed-node3 26264 1727204278.97397: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204278.97410: Calling all_plugins_play to load vars for managed-node3 26264 1727204278.97414: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204278.97417: Calling groups_plugins_play to load vars for managed-node3 26264 1727204278.98404: WORKER PROCESS EXITING 26264 1727204278.99230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204279.01188: done with get_vars() 26264 1727204279.01213: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.066) 0:00:42.862 ***** 26264 1727204279.01324: entering _queue_task() for managed-node3/stat 26264 1727204279.01668: worker is 1 (out of 1 available) 26264 1727204279.01685: exiting _queue_task() for managed-node3/stat 26264 1727204279.01699: done queuing things up, now waiting for results queue to drain 26264 1727204279.01701: waiting for pending results... 
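The "Stat profile file" task queued here ends up invoking Ansible's `stat` module against `/etc/sysconfig/network-scripts/ifcfg-lsr27` with `get_attributes`, `get_checksum`, and `get_mime` all disabled (the module arguments and result are visible in the trace below). A minimal sketch of the one field this test actually consumes — the function name `stat_exists` is hypothetical, and the real module returns far more detail:

```python
import os

def stat_exists(path):
    # Mirrors only the "exists" field of the stat module result shape
    # seen in this trace; the real module also reports mode, owner,
    # checksums, etc. when those options are enabled.
    return {"changed": False, "stat": {"exists": os.path.lexists(path)}}

# In this run the ifcfg file is absent on the managed node, so the
# recorded module output has "exists": false.
result = stat_exists("/etc/sysconfig/network-scripts/ifcfg-lsr27")
```

Disabling checksum/mime/attribute collection keeps the remote module cheap when, as here, the caller only needs a presence check.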
26264 1727204279.02028: running TaskExecutor() for managed-node3/TASK: Stat profile file 26264 1727204279.02182: in run() - task 0affcd87-79f5-5ff5-08b0-000000000503 26264 1727204279.02202: variable 'ansible_search_path' from source: unknown 26264 1727204279.02213: variable 'ansible_search_path' from source: unknown 26264 1727204279.02261: calling self._execute() 26264 1727204279.02374: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204279.02386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204279.02400: variable 'omit' from source: magic vars 26264 1727204279.02823: variable 'ansible_distribution_major_version' from source: facts 26264 1727204279.02844: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204279.02865: variable 'omit' from source: magic vars 26264 1727204279.02920: variable 'omit' from source: magic vars 26264 1727204279.03034: variable 'profile' from source: include params 26264 1727204279.03044: variable 'interface' from source: set_fact 26264 1727204279.03130: variable 'interface' from source: set_fact 26264 1727204279.03158: variable 'omit' from source: magic vars 26264 1727204279.03213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204279.03261: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204279.03294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204279.03319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204279.03338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204279.03380: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 
1727204279.03390: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204279.03403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204279.03516: Set connection var ansible_pipelining to False 26264 1727204279.03525: Set connection var ansible_connection to ssh 26264 1727204279.03532: Set connection var ansible_shell_type to sh 26264 1727204279.03546: Set connection var ansible_shell_executable to /bin/sh 26264 1727204279.03566: Set connection var ansible_timeout to 10 26264 1727204279.03582: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204279.03612: variable 'ansible_shell_executable' from source: unknown 26264 1727204279.03623: variable 'ansible_connection' from source: unknown 26264 1727204279.03631: variable 'ansible_module_compression' from source: unknown 26264 1727204279.03639: variable 'ansible_shell_type' from source: unknown 26264 1727204279.03645: variable 'ansible_shell_executable' from source: unknown 26264 1727204279.03655: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204279.03665: variable 'ansible_pipelining' from source: unknown 26264 1727204279.03676: variable 'ansible_timeout' from source: unknown 26264 1727204279.03684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204279.03912: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204279.03929: variable 'omit' from source: magic vars 26264 1727204279.03946: starting attempt loop 26264 1727204279.03957: running the handler 26264 1727204279.03977: _low_level_execute_command(): starting 26264 1727204279.03989: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204279.04831: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204279.04847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.04868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.04890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.04936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.04954: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204279.04971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.04993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204279.05006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204279.05017: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204279.05028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.05041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.05066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.05080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.05095: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204279.05112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.05199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204279.05218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.05233: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 26264 1727204279.05383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.06956: stdout chunk (state=3): >>>/root <<< 26264 1727204279.07145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204279.07151: stdout chunk (state=3): >>><<< 26264 1727204279.07153: stderr chunk (state=3): >>><<< 26264 1727204279.07264: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204279.07267: _low_level_execute_command(): starting 26264 1727204279.07270: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763 `" && echo ansible-tmp-1727204279.0717993-29537-21687487735763="` echo 
/root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763 `" ) && sleep 0' 26264 1727204279.07934: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204279.07959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.07980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.08000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.08047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.08073: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204279.08089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.08108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204279.08122: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204279.08133: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204279.08146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.08174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.08192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.08206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.08218: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204279.08232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.08320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204279.08344: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.08368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.08450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.10282: stdout chunk (state=3): >>>ansible-tmp-1727204279.0717993-29537-21687487735763=/root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763 <<< 26264 1727204279.10379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204279.10484: stderr chunk (state=3): >>><<< 26264 1727204279.10494: stdout chunk (state=3): >>><<< 26264 1727204279.10769: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204279.0717993-29537-21687487735763=/root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 26264 1727204279.10773: variable 'ansible_module_compression' from source: unknown 26264 1727204279.10775: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 26264 1727204279.10777: variable 'ansible_facts' from source: unknown 26264 1727204279.10779: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763/AnsiballZ_stat.py 26264 1727204279.10932: Sending initial data 26264 1727204279.10935: Sent initial data (152 bytes) 26264 1727204279.12116: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204279.12142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.12169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.12189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.12257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.12273: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204279.12287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.12306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204279.12318: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204279.12343: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204279.12360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.12383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.12406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
26264 1727204279.12419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.12432: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204279.12461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.12558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204279.12597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.12619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.12709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.14362: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204279.14404: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204279.14455: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpfvn7t320 /root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763/AnsiballZ_stat.py <<< 26264 1727204279.14496: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204279.15579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 
1727204279.15776: stderr chunk (state=3): >>><<< 26264 1727204279.15779: stdout chunk (state=3): >>><<< 26264 1727204279.15782: done transferring module to remote 26264 1727204279.15784: _low_level_execute_command(): starting 26264 1727204279.15790: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763/ /root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763/AnsiballZ_stat.py && sleep 0' 26264 1727204279.16331: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204279.16346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.16371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.16390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.16435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.16450: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204279.16473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.16492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204279.16505: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204279.16516: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204279.16552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.16571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.16593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.16606: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.16617: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204279.16630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.16739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204279.16760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.16778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.16869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.18528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204279.18612: stderr chunk (state=3): >>><<< 26264 1727204279.18615: stdout chunk (state=3): >>><<< 26264 1727204279.18707: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204279.18711: _low_level_execute_command(): starting 26264 1727204279.18713: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763/AnsiballZ_stat.py && sleep 0' 26264 1727204279.19291: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204279.19308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.19323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.19342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.19389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.19406: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204279.19422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.19440: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204279.19456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204279.19470: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204279.19483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.19499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.19519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.19532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.19543: stderr chunk (state=3): >>>debug2: 
match found <<< 26264 1727204279.19562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.19647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204279.19671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.19687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.19775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.32654: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 26264 1727204279.33689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204279.33693: stdout chunk (state=3): >>><<< 26264 1727204279.33695: stderr chunk (state=3): >>><<< 26264 1727204279.33836: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204279.33849: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204279.33853: _low_level_execute_command(): starting 26264 1727204279.33855: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204279.0717993-29537-21687487735763/ > /dev/null 2>&1 && sleep 0' 26264 1727204279.34488: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204279.34509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.34529: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 26264 1727204279.34546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.34599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.34612: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204279.34631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.34651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204279.34663: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204279.34675: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204279.34705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.34720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.34741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.34756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.34770: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204279.34787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.34878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204279.34910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.34931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.35005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.36745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
26264 1727204279.36788: stderr chunk (state=3): >>><<< 26264 1727204279.36792: stdout chunk (state=3): >>><<< 26264 1727204279.36806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204279.36815: handler run complete 26264 1727204279.36830: attempt loop complete, returning result 26264 1727204279.36834: _execute() done 26264 1727204279.36837: dumping result to json 26264 1727204279.36839: done dumping result, returning 26264 1727204279.36848: done running TaskExecutor() for managed-node3/TASK: Stat profile file [0affcd87-79f5-5ff5-08b0-000000000503] 26264 1727204279.36856: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000503 26264 1727204279.36951: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000503 26264 1727204279.36954: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { 
"exists": false } } 26264 1727204279.37013: no more pending results, returning what we have 26264 1727204279.37017: results queue empty 26264 1727204279.37018: checking for any_errors_fatal 26264 1727204279.37024: done checking for any_errors_fatal 26264 1727204279.37025: checking for max_fail_percentage 26264 1727204279.37027: done checking for max_fail_percentage 26264 1727204279.37027: checking to see if all hosts have failed and the running result is not ok 26264 1727204279.37029: done checking to see if all hosts have failed 26264 1727204279.37029: getting the remaining hosts for this loop 26264 1727204279.37031: done getting the remaining hosts for this loop 26264 1727204279.37035: getting the next task for host managed-node3 26264 1727204279.37042: done getting next task for host managed-node3 26264 1727204279.37044: ^ task is: TASK: Set NM profile exist flag based on the profile files 26264 1727204279.37048: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204279.37053: getting variables 26264 1727204279.37054: in VariableManager get_vars() 26264 1727204279.37087: Calling all_inventory to load vars for managed-node3 26264 1727204279.37090: Calling groups_inventory to load vars for managed-node3 26264 1727204279.37093: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204279.37105: Calling all_plugins_play to load vars for managed-node3 26264 1727204279.37107: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204279.37109: Calling groups_plugins_play to load vars for managed-node3 26264 1727204279.38012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204279.39517: done with get_vars() 26264 1727204279.39535: done getting variables 26264 1727204279.39584: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.382) 0:00:43.245 ***** 26264 1727204279.39613: entering _queue_task() for managed-node3/set_fact 26264 1727204279.39841: worker is 1 (out of 1 available) 26264 1727204279.39857: exiting _queue_task() for managed-node3/set_fact 26264 1727204279.39871: done queuing things up, now waiting for results queue to drain 26264 1727204279.39873: waiting for pending results... 
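The stat task traced above follows Ansible's non-pipelined module lifecycle, whose steps are visible in the `_low_level_execute_command()` calls: create a private tmpdir under `~/.ansible/tmp`, run the transferred AnsiballZ payload with the remote Python, then `rm -f -r` the tmpdir. A simplified, SSH-free re-enactment of that flow (the stub payload below is hypothetical; Ansible ships a real zipped AnsiballZ module, not a shell script):

```shell
# Simplified local re-enactment of the remote module lifecycle in the trace:
# private tmpdir -> payload -> chmod -> execute -> remove tmpdir.
tmpdir="$(umask 77 && mkdir -p "$HOME/.ansible/tmp" && mktemp -d "$HOME/.ansible/tmp/ansible-tmp-XXXXXXXX")"
# Hypothetical stand-in for AnsiballZ_stat.py; prints a module-style result.
printf '#!/bin/sh\necho "{\\"changed\\": false}"\n' > "$tmpdir/AnsiballZ_stub.sh"
chmod u+x "$tmpdir" "$tmpdir/AnsiballZ_stub.sh"
"$tmpdir/AnsiballZ_stub.sh"   # mirrors: /usr/bin/python3.9 .../AnsiballZ_stat.py
rm -rf "$tmpdir"              # mirrors: rm -f -r .../ansible-tmp-.../ 
```

The `umask 77` matters: it makes the tmpdir readable only by the connecting user, which is why the trace shows the mkdir wrapped in a subshell with that umask.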
26264 1727204279.40054: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 26264 1727204279.40123: in run() - task 0affcd87-79f5-5ff5-08b0-000000000504 26264 1727204279.40135: variable 'ansible_search_path' from source: unknown 26264 1727204279.40139: variable 'ansible_search_path' from source: unknown 26264 1727204279.40172: calling self._execute() 26264 1727204279.40242: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204279.40246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204279.40256: variable 'omit' from source: magic vars 26264 1727204279.40552: variable 'ansible_distribution_major_version' from source: facts 26264 1727204279.40562: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204279.40652: variable 'profile_stat' from source: set_fact 26264 1727204279.40661: Evaluated conditional (profile_stat.stat.exists): False 26264 1727204279.40666: when evaluation is False, skipping this task 26264 1727204279.40670: _execute() done 26264 1727204279.40672: dumping result to json 26264 1727204279.40675: done dumping result, returning 26264 1727204279.40681: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-5ff5-08b0-000000000504] 26264 1727204279.40687: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000504 26264 1727204279.40774: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000504 26264 1727204279.40777: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 26264 1727204279.40834: no more pending results, returning what we have 26264 1727204279.40840: results queue empty 26264 1727204279.40840: checking for any_errors_fatal 26264 1727204279.40850: done checking for any_errors_fatal 26264 1727204279.40851: 
checking for max_fail_percentage 26264 1727204279.40853: done checking for max_fail_percentage 26264 1727204279.40854: checking to see if all hosts have failed and the running result is not ok 26264 1727204279.40855: done checking to see if all hosts have failed 26264 1727204279.40855: getting the remaining hosts for this loop 26264 1727204279.40857: done getting the remaining hosts for this loop 26264 1727204279.40861: getting the next task for host managed-node3 26264 1727204279.40869: done getting next task for host managed-node3 26264 1727204279.40872: ^ task is: TASK: Get NM profile info 26264 1727204279.40876: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204279.40879: getting variables 26264 1727204279.40881: in VariableManager get_vars() 26264 1727204279.40906: Calling all_inventory to load vars for managed-node3 26264 1727204279.40908: Calling groups_inventory to load vars for managed-node3 26264 1727204279.40915: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204279.40928: Calling all_plugins_play to load vars for managed-node3 26264 1727204279.40931: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204279.40934: Calling groups_plugins_play to load vars for managed-node3 26264 1727204279.44848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204279.45772: done with get_vars() 26264 1727204279.45792: done getting variables 26264 1727204279.45848: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.062) 0:00:43.307 ***** 26264 1727204279.45869: entering _queue_task() for managed-node3/shell 26264 1727204279.45870: Creating lock for shell 26264 1727204279.46109: worker is 1 (out of 1 available) 26264 1727204279.46122: exiting _queue_task() for managed-node3/shell 26264 1727204279.46135: done queuing things up, now waiting for results queue to drain 26264 1727204279.46137: waiting for pending results... 
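The skip just above comes from the `when:` guard `profile_stat.stat.exists` evaluating to False against the earlier stat result (`"exists": false`). A rough shell analogue of that guard, for illustration only (Ansible evaluates it as a Jinja2 conditional on the registered fact, not as a shell test):

```shell
# The guard amounts to an existence check on the profile file recorded by
# the earlier stat task; lsr27 has no ifcfg file there, so the step skips.
profile_file="/etc/sysconfig/network-scripts/ifcfg-lsr27"
if [ -e "$profile_file" ]; then
  result="set flag: profile file exists"
else
  result="skipped: conditional result was False"
fi
echo "$result"
```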
26264 1727204279.46321: running TaskExecutor() for managed-node3/TASK: Get NM profile info 26264 1727204279.46413: in run() - task 0affcd87-79f5-5ff5-08b0-000000000505 26264 1727204279.46424: variable 'ansible_search_path' from source: unknown 26264 1727204279.46428: variable 'ansible_search_path' from source: unknown 26264 1727204279.46461: calling self._execute() 26264 1727204279.46532: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204279.46538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204279.46550: variable 'omit' from source: magic vars 26264 1727204279.46845: variable 'ansible_distribution_major_version' from source: facts 26264 1727204279.46858: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204279.46865: variable 'omit' from source: magic vars 26264 1727204279.46903: variable 'omit' from source: magic vars 26264 1727204279.46984: variable 'profile' from source: include params 26264 1727204279.46989: variable 'interface' from source: set_fact 26264 1727204279.47036: variable 'interface' from source: set_fact 26264 1727204279.47050: variable 'omit' from source: magic vars 26264 1727204279.47087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204279.47118: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204279.47134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204279.47147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204279.47160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204279.47184: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 
1727204279.47187: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204279.47190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204279.47263: Set connection var ansible_pipelining to False 26264 1727204279.47267: Set connection var ansible_connection to ssh 26264 1727204279.47270: Set connection var ansible_shell_type to sh 26264 1727204279.47274: Set connection var ansible_shell_executable to /bin/sh 26264 1727204279.47281: Set connection var ansible_timeout to 10 26264 1727204279.47287: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204279.47306: variable 'ansible_shell_executable' from source: unknown 26264 1727204279.47310: variable 'ansible_connection' from source: unknown 26264 1727204279.47312: variable 'ansible_module_compression' from source: unknown 26264 1727204279.47315: variable 'ansible_shell_type' from source: unknown 26264 1727204279.47319: variable 'ansible_shell_executable' from source: unknown 26264 1727204279.47321: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204279.47324: variable 'ansible_pipelining' from source: unknown 26264 1727204279.47327: variable 'ansible_timeout' from source: unknown 26264 1727204279.47330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204279.47486: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204279.47502: variable 'omit' from source: magic vars 26264 1727204279.47512: starting attempt loop 26264 1727204279.47519: running the handler 26264 1727204279.47532: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204279.47565: _low_level_execute_command(): starting 26264 1727204279.47583: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204279.48416: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204279.48445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.48466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.48488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.48532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.48558: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204279.48577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.48597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204279.48611: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204279.48623: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204279.48635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.48656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.48688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.48703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.48716: stderr chunk (state=3): >>>debug2: match found <<< 26264 
1727204279.48731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.48820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204279.48838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.48853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.48942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.50471: stdout chunk (state=3): >>>/root <<< 26264 1727204279.50576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204279.50628: stderr chunk (state=3): >>><<< 26264 1727204279.50632: stdout chunk (state=3): >>><<< 26264 1727204279.50655: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 26264 1727204279.50668: _low_level_execute_command(): starting 26264 1727204279.50674: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057 `" && echo ansible-tmp-1727204279.5065343-29555-33094130925057="` echo /root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057 `" ) && sleep 0' 26264 1727204279.51123: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.51127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.51169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.51175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.51177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.51228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.51231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.51280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 
1727204279.53087: stdout chunk (state=3): >>>ansible-tmp-1727204279.5065343-29555-33094130925057=/root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057 <<< 26264 1727204279.53208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204279.53261: stderr chunk (state=3): >>><<< 26264 1727204279.53268: stdout chunk (state=3): >>><<< 26264 1727204279.53287: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204279.5065343-29555-33094130925057=/root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204279.53314: variable 'ansible_module_compression' from source: unknown 26264 1727204279.53363: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26264 1727204279.53395: variable 
'ansible_facts' from source: unknown 26264 1727204279.53447: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057/AnsiballZ_command.py 26264 1727204279.53555: Sending initial data 26264 1727204279.53566: Sent initial data (155 bytes) 26264 1727204279.54256: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.54259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.54305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204279.54308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.54311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204279.54313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.54365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204279.54370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.54376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.54419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.56074: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204279.56107: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204279.56152: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp_q5oyhyl /root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057/AnsiballZ_command.py <<< 26264 1727204279.56189: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204279.56975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204279.57074: stderr chunk (state=3): >>><<< 26264 1727204279.57077: stdout chunk (state=3): >>><<< 26264 1727204279.57094: done transferring module to remote 26264 1727204279.57106: _low_level_execute_command(): starting 26264 1727204279.57110: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057/ /root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057/AnsiballZ_command.py && sleep 0' 26264 1727204279.57576: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.57589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 
1727204279.57609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204279.57625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.57681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204279.57696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.57735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.59396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204279.59439: stderr chunk (state=3): >>><<< 26264 1727204279.59442: stdout chunk (state=3): >>><<< 26264 1727204279.59458: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204279.59461: _low_level_execute_command(): starting 26264 1727204279.59466: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057/AnsiballZ_command.py && sleep 0' 26264 1727204279.59926: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.59944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.59955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.59969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.59979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.60024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.60038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.60093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.74843: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-24 14:57:59.730353", "end": "2024-09-24 14:57:59.747099", "delta": "0:00:00.016746", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26264 1727204279.76003: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204279.76007: stdout chunk (state=3): >>><<< 26264 1727204279.76012: stderr chunk (state=3): >>><<< 26264 1727204279.76039: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-24 14:57:59.730353", "end": "2024-09-24 14:57:59.747099", "delta": "0:00:00.016746", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.87 closed. 
26264 1727204279.76080: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204279.76089: _low_level_execute_command(): starting 26264 1727204279.76093: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204279.5065343-29555-33094130925057/ > /dev/null 2>&1 && sleep 0' 26264 1727204279.77095: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204279.77106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.77115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.77128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.77181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.77187: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204279.77197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.77209: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 26264 1727204279.77216: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204279.77223: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204279.77230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204279.77239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204279.77249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204279.77262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204279.77280: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204279.77289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204279.77363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204279.77378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204279.77400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204279.77585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204279.79345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204279.79348: stdout chunk (state=3): >>><<< 26264 1727204279.79363: stderr chunk (state=3): >>><<< 26264 1727204279.79407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204279.79413: handler run complete 26264 1727204279.79437: Evaluated conditional (False): False 26264 1727204279.79447: attempt loop complete, returning result 26264 1727204279.79450: _execute() done 26264 1727204279.79455: dumping result to json 26264 1727204279.79461: done dumping result, returning 26264 1727204279.79473: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [0affcd87-79f5-5ff5-08b0-000000000505] 26264 1727204279.79477: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000505 26264 1727204279.79585: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000505 26264 1727204279.79588: WORKER PROCESS EXITING fatal: [managed-node3]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.016746", "end": "2024-09-24 14:57:59.747099", "rc": 1, "start": "2024-09-24 14:57:59.730353" } MSG: non-zero return code ...ignoring 26264 1727204279.79658: no more pending results, returning what we have 26264 1727204279.79661: results queue empty 26264 1727204279.79662: checking for any_errors_fatal 26264 1727204279.79672: done checking for any_errors_fatal 26264 1727204279.79672: checking for max_fail_percentage 26264 1727204279.79674: done checking for max_fail_percentage 26264 1727204279.79675: checking to see if all hosts have failed and the running result is not ok 26264 1727204279.79676: done checking to see if all hosts have failed 26264 1727204279.79677: getting the remaining hosts for this loop 26264 1727204279.79679: done getting the remaining hosts for this loop 26264 1727204279.79683: getting the next task for host managed-node3 26264 1727204279.79690: done getting next task for host managed-node3 26264 1727204279.79692: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 26264 1727204279.79696: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 26264 1727204279.79701: getting variables 26264 1727204279.79703: in VariableManager get_vars() 26264 1727204279.79734: Calling all_inventory to load vars for managed-node3 26264 1727204279.79736: Calling groups_inventory to load vars for managed-node3 26264 1727204279.79740: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204279.79751: Calling all_plugins_play to load vars for managed-node3 26264 1727204279.79754: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204279.79756: Calling groups_plugins_play to load vars for managed-node3 26264 1727204279.82454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204279.84677: done with get_vars() 26264 1727204279.84704: done getting variables 26264 1727204279.84770: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.389) 0:00:43.697 ***** 26264 1727204279.84803: entering _queue_task() for managed-node3/set_fact 26264 1727204279.85118: worker is 1 (out of 1 available) 26264 1727204279.85131: exiting _queue_task() for managed-node3/set_fact 26264 1727204279.85143: done queuing things up, now waiting for results queue to drain 26264 1727204279.85145: waiting for pending results... 
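[Editor's note] The ignored failure above implies the task that produced it. Reconstructed as a sketch from the command string, the registered variable name (`nm_profile_exists`), and the `...ignoring` marker in this log — the actual task in `get_profile_stat.yml` may differ:

```yaml
# Sketch only — reconstructed from the log output, not copied from
# tests/network/playbooks/tasks/get_profile_stat.yml.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc
  register: nm_profile_exists
  ignore_errors: true  # matches the "...ignoring" marker after the fatal result
```

Note that `grep` exits 1 when nothing matches, so `rc=1` here only means no NetworkManager connection named `lsr27` is backed by a file under `/etc`; `ignore_errors` keeps the play running, and the registered `rc` drives the conditional evaluated in the next task.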
26264 1727204279.86597: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 26264 1727204279.86893: in run() - task 0affcd87-79f5-5ff5-08b0-000000000506 26264 1727204279.86913: variable 'ansible_search_path' from source: unknown 26264 1727204279.86942: variable 'ansible_search_path' from source: unknown 26264 1727204279.87018: calling self._execute() 26264 1727204279.87342: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204279.87357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204279.87387: variable 'omit' from source: magic vars 26264 1727204279.88319: variable 'ansible_distribution_major_version' from source: facts 26264 1727204279.88385: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204279.88802: variable 'nm_profile_exists' from source: set_fact 26264 1727204279.88824: Evaluated conditional (nm_profile_exists.rc == 0): False 26264 1727204279.88835: when evaluation is False, skipping this task 26264 1727204279.88841: _execute() done 26264 1727204279.88847: dumping result to json 26264 1727204279.88857: done dumping result, returning 26264 1727204279.88870: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-5ff5-08b0-000000000506] 26264 1727204279.88881: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000506 26264 1727204279.89032: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000506 26264 1727204279.89040: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 26264 1727204279.89215: no more pending results, returning what we have 26264 1727204279.89220: results queue empty 26264 1727204279.89221: checking for any_errors_fatal 26264 
1727204279.89233: done checking for any_errors_fatal 26264 1727204279.89234: checking for max_fail_percentage 26264 1727204279.89236: done checking for max_fail_percentage 26264 1727204279.89237: checking to see if all hosts have failed and the running result is not ok 26264 1727204279.89238: done checking to see if all hosts have failed 26264 1727204279.89238: getting the remaining hosts for this loop 26264 1727204279.89240: done getting the remaining hosts for this loop 26264 1727204279.89244: getting the next task for host managed-node3 26264 1727204279.89255: done getting next task for host managed-node3 26264 1727204279.89259: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 26264 1727204279.89263: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204279.89286: getting variables 26264 1727204279.89288: in VariableManager get_vars() 26264 1727204279.89318: Calling all_inventory to load vars for managed-node3 26264 1727204279.89321: Calling groups_inventory to load vars for managed-node3 26264 1727204279.89325: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204279.89337: Calling all_plugins_play to load vars for managed-node3 26264 1727204279.89340: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204279.89344: Calling groups_plugins_play to load vars for managed-node3 26264 1727204279.93299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204279.97198: done with get_vars() 26264 1727204279.97350: done getting variables 26264 1727204279.97531: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204279.97792: variable 'profile' from source: include params 26264 1727204279.97797: variable 'interface' from source: set_fact 26264 1727204279.97986: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.132) 0:00:43.829 ***** 26264 1727204279.98019: entering _queue_task() for managed-node3/command 26264 1727204279.98832: worker is 1 (out of 1 available) 26264 1727204279.98852: exiting _queue_task() for managed-node3/command 26264 1727204279.98868: done queuing things up, now waiting for results queue to drain 26264 1727204279.98870: waiting for pending results... 
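[Editor's note] The skip above follows directly from the `when:` guard reported in `false_condition`. A minimal sketch of such a guarded `set_fact` (the fact name below is a placeholder for illustration, not taken from this log):

```yaml
# Sketch; "profile_is_managed" is a hypothetical fact name, and the real
# task at get_profile_stat.yml:35 may set different facts.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    profile_is_managed: true
  when: nm_profile_exists.rc == 0  # rc was 1 above, so this is False and the task is skipped
```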
26264 1727204279.99535: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-lsr27 26264 1727204279.99691: in run() - task 0affcd87-79f5-5ff5-08b0-000000000508 26264 1727204279.99918: variable 'ansible_search_path' from source: unknown 26264 1727204279.99928: variable 'ansible_search_path' from source: unknown 26264 1727204279.99974: calling self._execute() 26264 1727204280.00185: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.00284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.00304: variable 'omit' from source: magic vars 26264 1727204280.01446: variable 'ansible_distribution_major_version' from source: facts 26264 1727204280.01469: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204280.01597: variable 'profile_stat' from source: set_fact 26264 1727204280.01617: Evaluated conditional (profile_stat.stat.exists): False 26264 1727204280.01624: when evaluation is False, skipping this task 26264 1727204280.01633: _execute() done 26264 1727204280.01640: dumping result to json 26264 1727204280.01648: done dumping result, returning 26264 1727204280.01660: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-lsr27 [0affcd87-79f5-5ff5-08b0-000000000508] 26264 1727204280.01674: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000508 26264 1727204280.01789: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000508 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 26264 1727204280.01846: no more pending results, returning what we have 26264 1727204280.01853: results queue empty 26264 1727204280.01854: checking for any_errors_fatal 26264 1727204280.01861: done checking for any_errors_fatal 26264 1727204280.01862: checking for max_fail_percentage 26264 1727204280.01866: 
done checking for max_fail_percentage 26264 1727204280.01867: checking to see if all hosts have failed and the running result is not ok 26264 1727204280.01868: done checking to see if all hosts have failed 26264 1727204280.01869: getting the remaining hosts for this loop 26264 1727204280.01870: done getting the remaining hosts for this loop 26264 1727204280.01874: getting the next task for host managed-node3 26264 1727204280.01882: done getting next task for host managed-node3 26264 1727204280.01886: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 26264 1727204280.01890: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204280.01895: getting variables 26264 1727204280.01897: in VariableManager get_vars() 26264 1727204280.01927: Calling all_inventory to load vars for managed-node3 26264 1727204280.01930: Calling groups_inventory to load vars for managed-node3 26264 1727204280.01934: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204280.01952: Calling all_plugins_play to load vars for managed-node3 26264 1727204280.01955: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204280.01959: Calling groups_plugins_play to load vars for managed-node3 26264 1727204280.02620: WORKER PROCESS EXITING 26264 1727204280.05766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204280.10039: done with get_vars() 26264 1727204280.10068: done getting variables 26264 1727204280.10246: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204280.10483: variable 'profile' from source: include params 26264 1727204280.10487: variable 'interface' from source: set_fact 26264 1727204280.10670: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:58:00 -0400 (0:00:00.126) 0:00:43.956 ***** 26264 1727204280.10702: entering _queue_task() for managed-node3/set_fact 26264 1727204280.11507: worker is 1 (out of 1 available) 26264 1727204280.11519: exiting _queue_task() for managed-node3/set_fact 26264 1727204280.11531: done queuing things up, now waiting for results queue to drain 26264 
1727204280.11533: waiting for pending results... 26264 1727204280.12247: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-lsr27 26264 1727204280.12353: in run() - task 0affcd87-79f5-5ff5-08b0-000000000509 26264 1727204280.12368: variable 'ansible_search_path' from source: unknown 26264 1727204280.12373: variable 'ansible_search_path' from source: unknown 26264 1727204280.12408: calling self._execute() 26264 1727204280.12511: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.12518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.12531: variable 'omit' from source: magic vars 26264 1727204280.13629: variable 'ansible_distribution_major_version' from source: facts 26264 1727204280.13638: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204280.13867: variable 'profile_stat' from source: set_fact 26264 1727204280.13880: Evaluated conditional (profile_stat.stat.exists): False 26264 1727204280.13883: when evaluation is False, skipping this task 26264 1727204280.13885: _execute() done 26264 1727204280.13888: dumping result to json 26264 1727204280.13891: done dumping result, returning 26264 1727204280.13900: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [0affcd87-79f5-5ff5-08b0-000000000509] 26264 1727204280.13906: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000509 26264 1727204280.14005: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000509 26264 1727204280.14009: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 26264 1727204280.14084: no more pending results, returning what we have 26264 1727204280.14088: results queue empty 26264 1727204280.14089: checking for any_errors_fatal 26264 1727204280.14096: done checking 
for any_errors_fatal 26264 1727204280.14097: checking for max_fail_percentage 26264 1727204280.14099: done checking for max_fail_percentage 26264 1727204280.14100: checking to see if all hosts have failed and the running result is not ok 26264 1727204280.14101: done checking to see if all hosts have failed 26264 1727204280.14102: getting the remaining hosts for this loop 26264 1727204280.14103: done getting the remaining hosts for this loop 26264 1727204280.14108: getting the next task for host managed-node3 26264 1727204280.14116: done getting next task for host managed-node3 26264 1727204280.14120: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 26264 1727204280.14124: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204280.14131: getting variables 26264 1727204280.14133: in VariableManager get_vars() 26264 1727204280.14167: Calling all_inventory to load vars for managed-node3 26264 1727204280.14171: Calling groups_inventory to load vars for managed-node3 26264 1727204280.14175: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204280.14189: Calling all_plugins_play to load vars for managed-node3 26264 1727204280.14192: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204280.14195: Calling groups_plugins_play to load vars for managed-node3 26264 1727204280.16907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204280.20810: done with get_vars() 26264 1727204280.20850: done getting variables 26264 1727204280.21218: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204280.21338: variable 'profile' from source: include params 26264 1727204280.21343: variable 'interface' from source: set_fact 26264 1727204280.21611: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:58:00 -0400 (0:00:00.109) 0:00:44.065 ***** 26264 1727204280.21646: entering _queue_task() for managed-node3/command 26264 1727204280.22403: worker is 1 (out of 1 available) 26264 1727204280.22413: exiting _queue_task() for managed-node3/command 26264 1727204280.22424: done queuing things up, now waiting for results queue to drain 26264 1727204280.22426: waiting for pending results... 
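[Editor's note] The four ifcfg-comment tasks in this stretch ("Get"/"Verify the ansible_managed comment", "Get"/"Verify the fingerprint comment") are all guarded the same way on `profile_stat.stat.exists`; since no `ifcfg-lsr27` file exists, each is skipped. A sketch of the pattern — the command body and path are placeholders, as the log excerpt does not show them:

```yaml
# Sketch of the guard pattern only; the grep and the ifcfg path below are
# placeholders, not the actual task body from get_profile_stat.yml:49-69.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  when: profile_stat.stat.exists  # False here, so the task is skipped
```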
26264 1727204280.23163: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-lsr27 26264 1727204280.23584: in run() - task 0affcd87-79f5-5ff5-08b0-00000000050a 26264 1727204280.23605: variable 'ansible_search_path' from source: unknown 26264 1727204280.23613: variable 'ansible_search_path' from source: unknown 26264 1727204280.23674: calling self._execute() 26264 1727204280.23845: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.23977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.23993: variable 'omit' from source: magic vars 26264 1727204280.24848: variable 'ansible_distribution_major_version' from source: facts 26264 1727204280.24867: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204280.25108: variable 'profile_stat' from source: set_fact 26264 1727204280.25129: Evaluated conditional (profile_stat.stat.exists): False 26264 1727204280.25166: when evaluation is False, skipping this task 26264 1727204280.25184: _execute() done 26264 1727204280.25193: dumping result to json 26264 1727204280.25275: done dumping result, returning 26264 1727204280.25295: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-lsr27 [0affcd87-79f5-5ff5-08b0-00000000050a] 26264 1727204280.25307: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000050a skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 26264 1727204280.25469: no more pending results, returning what we have 26264 1727204280.25473: results queue empty 26264 1727204280.25474: checking for any_errors_fatal 26264 1727204280.25481: done checking for any_errors_fatal 26264 1727204280.25482: checking for max_fail_percentage 26264 1727204280.25484: done checking for max_fail_percentage 26264 1727204280.25485: checking to see if all hosts have failed 
and the running result is not ok 26264 1727204280.25486: done checking to see if all hosts have failed 26264 1727204280.25486: getting the remaining hosts for this loop 26264 1727204280.25488: done getting the remaining hosts for this loop 26264 1727204280.25492: getting the next task for host managed-node3 26264 1727204280.25500: done getting next task for host managed-node3 26264 1727204280.25502: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 26264 1727204280.25506: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204280.25510: getting variables 26264 1727204280.25512: in VariableManager get_vars() 26264 1727204280.25542: Calling all_inventory to load vars for managed-node3 26264 1727204280.25545: Calling groups_inventory to load vars for managed-node3 26264 1727204280.25552: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204280.25570: Calling all_plugins_play to load vars for managed-node3 26264 1727204280.25574: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204280.25578: Calling groups_plugins_play to load vars for managed-node3 26264 1727204280.26097: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000050a 26264 1727204280.26100: WORKER PROCESS EXITING 26264 1727204280.28438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204280.32513: done with get_vars() 26264 1727204280.32553: done getting variables 26264 1727204280.32631: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204280.32954: variable 'profile' from source: include params 26264 1727204280.32958: variable 'interface' from source: set_fact 26264 1727204280.33135: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:58:00 -0400 (0:00:00.115) 0:00:44.180 ***** 26264 1727204280.33174: entering _queue_task() for managed-node3/set_fact 26264 1727204280.33909: worker is 1 (out of 1 available) 26264 1727204280.34019: exiting _queue_task() for managed-node3/set_fact 26264 
1727204280.34031: done queuing things up, now waiting for results queue to drain 26264 1727204280.34033: waiting for pending results... 26264 1727204280.34949: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-lsr27 26264 1727204280.35098: in run() - task 0affcd87-79f5-5ff5-08b0-00000000050b 26264 1727204280.35117: variable 'ansible_search_path' from source: unknown 26264 1727204280.35132: variable 'ansible_search_path' from source: unknown 26264 1727204280.35196: calling self._execute() 26264 1727204280.35318: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.35330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.35344: variable 'omit' from source: magic vars 26264 1727204280.36869: variable 'ansible_distribution_major_version' from source: facts 26264 1727204280.37027: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204280.38213: variable 'profile_stat' from source: set_fact 26264 1727204280.38237: Evaluated conditional (profile_stat.stat.exists): False 26264 1727204280.38245: when evaluation is False, skipping this task 26264 1727204280.38252: _execute() done 26264 1727204280.38260: dumping result to json 26264 1727204280.38269: done dumping result, returning 26264 1727204280.38281: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-lsr27 [0affcd87-79f5-5ff5-08b0-00000000050b] 26264 1727204280.38291: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000050b skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 26264 1727204280.38446: no more pending results, returning what we have 26264 1727204280.38453: results queue empty 26264 1727204280.38454: checking for any_errors_fatal 26264 1727204280.38461: done checking for any_errors_fatal 26264 1727204280.38462: checking for 
max_fail_percentage 26264 1727204280.38465: done checking for max_fail_percentage 26264 1727204280.38466: checking to see if all hosts have failed and the running result is not ok 26264 1727204280.38467: done checking to see if all hosts have failed 26264 1727204280.38467: getting the remaining hosts for this loop 26264 1727204280.38469: done getting the remaining hosts for this loop 26264 1727204280.38473: getting the next task for host managed-node3 26264 1727204280.38482: done getting next task for host managed-node3 26264 1727204280.38485: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 26264 1727204280.38488: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204280.38493: getting variables 26264 1727204280.38495: in VariableManager get_vars() 26264 1727204280.38525: Calling all_inventory to load vars for managed-node3 26264 1727204280.38527: Calling groups_inventory to load vars for managed-node3 26264 1727204280.38531: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204280.38549: Calling all_plugins_play to load vars for managed-node3 26264 1727204280.38552: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204280.38556: Calling groups_plugins_play to load vars for managed-node3 26264 1727204280.39544: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000050b 26264 1727204280.39548: WORKER PROCESS EXITING 26264 1727204280.41668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204280.44065: done with get_vars() 26264 1727204280.44211: done getting variables 26264 1727204280.44285: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 26264 1727204280.44544: variable 'profile' from source: include params 26264 1727204280.44551: variable 'interface' from source: set_fact 26264 1727204280.44624: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:58:00 -0400 (0:00:00.114) 0:00:44.295 ***** 26264 1727204280.44667: entering _queue_task() for managed-node3/assert 26264 1727204280.45153: worker is 1 (out of 1 available) 26264 1727204280.45278: exiting _queue_task() for managed-node3/assert 26264 
1727204280.45295: done queuing things up, now waiting for results queue to drain 26264 1727204280.45297: waiting for pending results... 26264 1727204280.46304: running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'lsr27' 26264 1727204280.46502: in run() - task 0affcd87-79f5-5ff5-08b0-0000000004f6 26264 1727204280.46507: variable 'ansible_search_path' from source: unknown 26264 1727204280.46509: variable 'ansible_search_path' from source: unknown 26264 1727204280.46560: calling self._execute() 26264 1727204280.46653: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.46657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.46882: variable 'omit' from source: magic vars 26264 1727204280.47056: variable 'ansible_distribution_major_version' from source: facts 26264 1727204280.47072: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204280.47076: variable 'omit' from source: magic vars 26264 1727204280.47115: variable 'omit' from source: magic vars 26264 1727204280.47211: variable 'profile' from source: include params 26264 1727204280.47215: variable 'interface' from source: set_fact 26264 1727204280.47282: variable 'interface' from source: set_fact 26264 1727204280.47299: variable 'omit' from source: magic vars 26264 1727204280.47339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204280.47494: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204280.47513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204280.47531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204280.47541: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204280.47689: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204280.47693: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.47695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.48001: Set connection var ansible_pipelining to False 26264 1727204280.48008: Set connection var ansible_connection to ssh 26264 1727204280.48011: Set connection var ansible_shell_type to sh 26264 1727204280.48019: Set connection var ansible_shell_executable to /bin/sh 26264 1727204280.48027: Set connection var ansible_timeout to 10 26264 1727204280.48078: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204280.48778: variable 'ansible_shell_executable' from source: unknown 26264 1727204280.48782: variable 'ansible_connection' from source: unknown 26264 1727204280.48784: variable 'ansible_module_compression' from source: unknown 26264 1727204280.48786: variable 'ansible_shell_type' from source: unknown 26264 1727204280.48789: variable 'ansible_shell_executable' from source: unknown 26264 1727204280.48791: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.48794: variable 'ansible_pipelining' from source: unknown 26264 1727204280.48797: variable 'ansible_timeout' from source: unknown 26264 1727204280.48800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.49773: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204280.49786: variable 'omit' from source: magic vars 26264 1727204280.49933: starting 
attempt loop 26264 1727204280.49936: running the handler 26264 1727204280.50578: variable 'lsr_net_profile_exists' from source: set_fact 26264 1727204280.50584: Evaluated conditional (not lsr_net_profile_exists): True 26264 1727204280.50590: handler run complete 26264 1727204280.50605: attempt loop complete, returning result 26264 1727204280.50608: _execute() done 26264 1727204280.50611: dumping result to json 26264 1727204280.50614: done dumping result, returning 26264 1727204280.50644: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'lsr27' [0affcd87-79f5-5ff5-08b0-0000000004f6] 26264 1727204280.50647: sending task result for task 0affcd87-79f5-5ff5-08b0-0000000004f6 26264 1727204280.50768: done sending task result for task 0affcd87-79f5-5ff5-08b0-0000000004f6 26264 1727204280.50772: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 26264 1727204280.50831: no more pending results, returning what we have 26264 1727204280.50837: results queue empty 26264 1727204280.50838: checking for any_errors_fatal 26264 1727204280.50843: done checking for any_errors_fatal 26264 1727204280.50844: checking for max_fail_percentage 26264 1727204280.50846: done checking for max_fail_percentage 26264 1727204280.50850: checking to see if all hosts have failed and the running result is not ok 26264 1727204280.50851: done checking to see if all hosts have failed 26264 1727204280.50852: getting the remaining hosts for this loop 26264 1727204280.50853: done getting the remaining hosts for this loop 26264 1727204280.50858: getting the next task for host managed-node3 26264 1727204280.50868: done getting next task for host managed-node3 26264 1727204280.50872: ^ task is: TASK: Include the task 'assert_device_absent.yml' 26264 1727204280.50875: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204280.50880: getting variables 26264 1727204280.50881: in VariableManager get_vars() 26264 1727204280.50913: Calling all_inventory to load vars for managed-node3 26264 1727204280.50917: Calling groups_inventory to load vars for managed-node3 26264 1727204280.50921: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204280.50935: Calling all_plugins_play to load vars for managed-node3 26264 1727204280.50938: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204280.50941: Calling groups_plugins_play to load vars for managed-node3 26264 1727204280.55179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204280.59530: done with get_vars() 26264 1727204280.59569: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Tuesday 24 September 2024 14:58:00 -0400 (0:00:00.152) 0:00:44.447 ***** 26264 1727204280.59877: entering _queue_task() for managed-node3/include_tasks 26264 1727204280.60287: worker is 1 (out of 1 available) 26264 1727204280.60300: exiting _queue_task() for managed-node3/include_tasks 26264 1727204280.60318: done queuing things up, now waiting for results queue to drain 26264 1727204280.60320: waiting for pending results... 
26264 1727204280.60647: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' 26264 1727204280.60971: in run() - task 0affcd87-79f5-5ff5-08b0-000000000075 26264 1727204280.60989: variable 'ansible_search_path' from source: unknown 26264 1727204280.61033: calling self._execute() 26264 1727204280.61194: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.61199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.61201: variable 'omit' from source: magic vars 26264 1727204280.61772: variable 'ansible_distribution_major_version' from source: facts 26264 1727204280.61791: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204280.61795: _execute() done 26264 1727204280.61800: dumping result to json 26264 1727204280.61803: done dumping result, returning 26264 1727204280.61811: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' [0affcd87-79f5-5ff5-08b0-000000000075] 26264 1727204280.61816: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000075 26264 1727204280.61918: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000075 26264 1727204280.61921: WORKER PROCESS EXITING 26264 1727204280.61969: no more pending results, returning what we have 26264 1727204280.61974: in VariableManager get_vars() 26264 1727204280.62008: Calling all_inventory to load vars for managed-node3 26264 1727204280.62011: Calling groups_inventory to load vars for managed-node3 26264 1727204280.62014: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204280.62029: Calling all_plugins_play to load vars for managed-node3 26264 1727204280.62032: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204280.62035: Calling groups_plugins_play to load vars for managed-node3 26264 1727204280.63617: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204280.65501: done with get_vars() 26264 1727204280.65534: variable 'ansible_search_path' from source: unknown 26264 1727204280.65558: we have included files to process 26264 1727204280.65559: generating all_blocks data 26264 1727204280.65567: done generating all_blocks data 26264 1727204280.65575: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 26264 1727204280.65577: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 26264 1727204280.65580: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 26264 1727204280.65763: in VariableManager get_vars() 26264 1727204280.65783: done with get_vars() 26264 1727204280.65936: done processing included file 26264 1727204280.65939: iterating over new_blocks loaded from include file 26264 1727204280.65941: in VariableManager get_vars() 26264 1727204280.65953: done with get_vars() 26264 1727204280.65954: filtering new block on tags 26264 1727204280.65974: done filtering new block on tags 26264 1727204280.65976: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node3 26264 1727204280.65980: extending task lists for all hosts with included blocks 26264 1727204280.66156: done extending task lists 26264 1727204280.66157: done processing included files 26264 1727204280.66158: results queue empty 26264 1727204280.66159: checking for any_errors_fatal 26264 1727204280.66162: done checking for any_errors_fatal 26264 1727204280.66163: checking for max_fail_percentage 26264 1727204280.66166: done 
checking for max_fail_percentage 26264 1727204280.66167: checking to see if all hosts have failed and the running result is not ok 26264 1727204280.66168: done checking to see if all hosts have failed 26264 1727204280.66169: getting the remaining hosts for this loop 26264 1727204280.66170: done getting the remaining hosts for this loop 26264 1727204280.66173: getting the next task for host managed-node3 26264 1727204280.66177: done getting next task for host managed-node3 26264 1727204280.66179: ^ task is: TASK: Include the task 'get_interface_stat.yml' 26264 1727204280.66182: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204280.66184: getting variables 26264 1727204280.66185: in VariableManager get_vars() 26264 1727204280.66194: Calling all_inventory to load vars for managed-node3 26264 1727204280.66196: Calling groups_inventory to load vars for managed-node3 26264 1727204280.66199: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204280.66204: Calling all_plugins_play to load vars for managed-node3 26264 1727204280.66206: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204280.66209: Calling groups_plugins_play to load vars for managed-node3 26264 1727204280.67635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204280.69317: done with get_vars() 26264 1727204280.69338: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:58:00 -0400 (0:00:00.095) 0:00:44.543 ***** 26264 1727204280.69414: entering _queue_task() for managed-node3/include_tasks 26264 1727204280.69749: worker is 1 (out of 1 available) 26264 1727204280.69761: exiting _queue_task() for managed-node3/include_tasks 26264 1727204280.69775: done queuing things up, now waiting for results queue to drain 26264 1727204280.69777: waiting for pending results... 
26264 1727204280.70058: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 26264 1727204280.70162: in run() - task 0affcd87-79f5-5ff5-08b0-00000000053c 26264 1727204280.70207: variable 'ansible_search_path' from source: unknown 26264 1727204280.70211: variable 'ansible_search_path' from source: unknown 26264 1727204280.70251: calling self._execute() 26264 1727204280.70365: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.70387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.70406: variable 'omit' from source: magic vars 26264 1727204280.70921: variable 'ansible_distribution_major_version' from source: facts 26264 1727204280.70925: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204280.70928: _execute() done 26264 1727204280.70930: dumping result to json 26264 1727204280.70932: done dumping result, returning 26264 1727204280.70935: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-5ff5-08b0-00000000053c] 26264 1727204280.70942: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000053c 26264 1727204280.71045: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000053c 26264 1727204280.71050: WORKER PROCESS EXITING 26264 1727204280.71092: no more pending results, returning what we have 26264 1727204280.71096: in VariableManager get_vars() 26264 1727204280.71130: Calling all_inventory to load vars for managed-node3 26264 1727204280.71133: Calling groups_inventory to load vars for managed-node3 26264 1727204280.71137: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204280.71335: Calling all_plugins_play to load vars for managed-node3 26264 1727204280.71338: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204280.71346: Calling groups_plugins_play to load vars for managed-node3 26264 
1727204280.72914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204280.74511: done with get_vars() 26264 1727204280.74530: variable 'ansible_search_path' from source: unknown 26264 1727204280.74531: variable 'ansible_search_path' from source: unknown 26264 1727204280.74580: we have included files to process 26264 1727204280.74581: generating all_blocks data 26264 1727204280.74583: done generating all_blocks data 26264 1727204280.74584: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26264 1727204280.74585: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26264 1727204280.74588: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26264 1727204280.74786: done processing included file 26264 1727204280.74787: iterating over new_blocks loaded from include file 26264 1727204280.74788: in VariableManager get_vars() 26264 1727204280.74797: done with get_vars() 26264 1727204280.74798: filtering new block on tags 26264 1727204280.74812: done filtering new block on tags 26264 1727204280.74816: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 26264 1727204280.74823: extending task lists for all hosts with included blocks 26264 1727204280.74931: done extending task lists 26264 1727204280.74932: done processing included files 26264 1727204280.74933: results queue empty 26264 1727204280.74933: checking for any_errors_fatal 26264 1727204280.74936: done checking for any_errors_fatal 26264 1727204280.74936: checking for max_fail_percentage 26264 1727204280.74937: done checking for 
max_fail_percentage 26264 1727204280.74938: checking to see if all hosts have failed and the running result is not ok 26264 1727204280.74938: done checking to see if all hosts have failed 26264 1727204280.74939: getting the remaining hosts for this loop 26264 1727204280.74940: done getting the remaining hosts for this loop 26264 1727204280.74941: getting the next task for host managed-node3 26264 1727204280.74945: done getting next task for host managed-node3 26264 1727204280.74947: ^ task is: TASK: Get stat for interface {{ interface }} 26264 1727204280.74950: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204280.74952: getting variables 26264 1727204280.74953: in VariableManager get_vars() 26264 1727204280.74959: Calling all_inventory to load vars for managed-node3 26264 1727204280.74961: Calling groups_inventory to load vars for managed-node3 26264 1727204280.74962: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204280.74967: Calling all_plugins_play to load vars for managed-node3 26264 1727204280.74969: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204280.74971: Calling groups_plugins_play to load vars for managed-node3 26264 1727204280.75846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204280.77039: done with get_vars() 26264 1727204280.77068: done getting variables 26264 1727204280.77207: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:58:00 -0400 (0:00:00.078) 0:00:44.621 ***** 26264 1727204280.77235: entering _queue_task() for managed-node3/stat 26264 1727204280.77552: worker is 1 (out of 1 available) 26264 1727204280.77566: exiting _queue_task() for managed-node3/stat 26264 1727204280.77578: done queuing things up, now waiting for results queue to drain 26264 1727204280.77580: waiting for pending results... 
26264 1727204280.77789: running TaskExecutor() for managed-node3/TASK: Get stat for interface lsr27 26264 1727204280.77911: in run() - task 0affcd87-79f5-5ff5-08b0-000000000554 26264 1727204280.77932: variable 'ansible_search_path' from source: unknown 26264 1727204280.77935: variable 'ansible_search_path' from source: unknown 26264 1727204280.77967: calling self._execute() 26264 1727204280.78040: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.78043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.78051: variable 'omit' from source: magic vars 26264 1727204280.78354: variable 'ansible_distribution_major_version' from source: facts 26264 1727204280.78367: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204280.78374: variable 'omit' from source: magic vars 26264 1727204280.78410: variable 'omit' from source: magic vars 26264 1727204280.78485: variable 'interface' from source: set_fact 26264 1727204280.78498: variable 'omit' from source: magic vars 26264 1727204280.78531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204280.78572: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204280.78587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204280.78602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204280.78612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204280.78635: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204280.78638: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.78641: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.78715: Set connection var ansible_pipelining to False 26264 1727204280.78772: Set connection var ansible_connection to ssh 26264 1727204280.78775: Set connection var ansible_shell_type to sh 26264 1727204280.78778: Set connection var ansible_shell_executable to /bin/sh 26264 1727204280.78780: Set connection var ansible_timeout to 10 26264 1727204280.78784: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204280.78787: variable 'ansible_shell_executable' from source: unknown 26264 1727204280.78789: variable 'ansible_connection' from source: unknown 26264 1727204280.78792: variable 'ansible_module_compression' from source: unknown 26264 1727204280.78795: variable 'ansible_shell_type' from source: unknown 26264 1727204280.78797: variable 'ansible_shell_executable' from source: unknown 26264 1727204280.78799: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204280.78802: variable 'ansible_pipelining' from source: unknown 26264 1727204280.78804: variable 'ansible_timeout' from source: unknown 26264 1727204280.78806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204280.78992: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 26264 1727204280.79001: variable 'omit' from source: magic vars 26264 1727204280.79011: starting attempt loop 26264 1727204280.79022: running the handler 26264 1727204280.79038: _low_level_execute_command(): starting 26264 1727204280.79050: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204280.79719: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204280.79733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204280.79758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204280.79774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204280.79785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204280.79835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204280.79854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204280.79867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204280.79929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204280.81545: stdout chunk (state=3): >>>/root <<< 26264 1727204280.81687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204280.81818: stderr chunk (state=3): >>><<< 26264 1727204280.81866: stdout chunk (state=3): >>><<< 26264 1727204280.81992: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204280.81997: _low_level_execute_command(): starting 26264 1727204280.81999: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624 `" && echo ansible-tmp-1727204280.818926-29666-19235208420624="` echo /root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624 `" ) && sleep 0' 26264 1727204280.82573: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204280.82589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204280.82612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204280.82631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204280.82674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204280.82703: stderr chunk 
(state=3): >>>debug2: match not found <<< 26264 1727204280.82717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204280.82733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204280.82742: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204280.82761: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204280.82782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204280.82796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204280.82812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204280.82829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204280.82842: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204280.82854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204280.82934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204280.82950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204280.82969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204280.83046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204280.84861: stdout chunk (state=3): >>>ansible-tmp-1727204280.818926-29666-19235208420624=/root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624 <<< 26264 1727204280.84981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204280.85034: stderr chunk (state=3): >>><<< 26264 1727204280.85038: stdout chunk (state=3): >>><<< 26264 1727204280.85054: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204280.818926-29666-19235208420624=/root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204280.85099: variable 'ansible_module_compression' from source: unknown 26264 1727204280.85145: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 26264 1727204280.85180: variable 'ansible_facts' from source: unknown 26264 1727204280.85237: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624/AnsiballZ_stat.py 26264 1727204280.85381: Sending initial data 26264 1727204280.85393: Sent initial data (151 bytes) 26264 1727204280.86087: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204280.86090: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204280.86093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204280.86131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204280.86134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204280.86136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204280.86138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204280.86187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204280.86195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204280.86204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204280.86261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204280.87943: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204280.88012: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 26264 1727204280.88015: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204280.88061: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpm_rqqwre /root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624/AnsiballZ_stat.py <<< 26264 1727204280.88096: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204280.88889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204280.88992: stderr chunk (state=3): >>><<< 26264 1727204280.88995: stdout chunk (state=3): >>><<< 26264 1727204280.89011: done transferring module to remote 26264 1727204280.89020: _low_level_execute_command(): starting 26264 1727204280.89025: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624/ /root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624/AnsiballZ_stat.py && sleep 0' 26264 1727204280.89484: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204280.89488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204280.89525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204280.89528: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204280.89534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204280.89583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204280.89594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204280.89644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204280.91314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204280.91389: stderr chunk (state=3): >>><<< 26264 1727204280.91393: stdout chunk (state=3): >>><<< 26264 1727204280.91409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204280.91412: _low_level_execute_command(): starting 26264 1727204280.91417: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624/AnsiballZ_stat.py && sleep 0' 26264 1727204280.92080: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204280.92089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204280.92099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204280.92112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204280.92159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204280.92167: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204280.92178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204280.92191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204280.92198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204280.92204: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204280.92212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204280.92221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204280.92233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 
1727204280.92247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204280.92257: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204280.92269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204280.92339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204280.92366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204280.92377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204280.92451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204281.05298: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 26264 1727204281.06411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204281.06425: stdout chunk (state=3): >>><<< 26264 1727204281.06430: stderr chunk (state=3): >>><<< 26264 1727204281.06481: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
26264 1727204281.06487: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204281.06490: _low_level_execute_command(): starting 26264 1727204281.06492: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204280.818926-29666-19235208420624/ > /dev/null 2>&1 && sleep 0' 26264 1727204281.07578: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204281.07582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204281.07602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204281.07635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204281.07672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204281.07706: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204281.07709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.07742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204281.07761: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204281.07787: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204281.07801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204281.07803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204281.07806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204281.07808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204281.07844: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204281.07847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.07991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204281.07994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204281.08001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204281.08115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204281.09868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204281.09938: stderr chunk (state=3): >>><<< 26264 1727204281.09941: stdout chunk (state=3): >>><<< 26264 1727204281.09961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204281.09969: handler run complete 26264 1727204281.09991: attempt loop complete, returning result 26264 1727204281.09994: _execute() done 26264 1727204281.09996: dumping result to json 26264 1727204281.09998: done dumping result, returning 26264 1727204281.10007: done running TaskExecutor() for managed-node3/TASK: Get stat for interface lsr27 [0affcd87-79f5-5ff5-08b0-000000000554] 26264 1727204281.10012: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000554 26264 1727204281.10112: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000554 26264 1727204281.10115: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 26264 1727204281.10170: no more pending results, returning what we have 26264 1727204281.10174: results queue empty 26264 1727204281.10175: checking for any_errors_fatal 26264 1727204281.10177: done checking for any_errors_fatal 26264 1727204281.10177: checking for max_fail_percentage 26264 1727204281.10179: done checking for max_fail_percentage 26264 1727204281.10180: checking to see if all hosts have failed and the running result is not ok 26264 1727204281.10181: done checking to see if all hosts have failed 26264 1727204281.10182: getting the remaining hosts for this loop 26264 1727204281.10184: 
done getting the remaining hosts for this loop 26264 1727204281.10188: getting the next task for host managed-node3 26264 1727204281.10195: done getting next task for host managed-node3 26264 1727204281.10197: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 26264 1727204281.10200: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204281.10205: getting variables 26264 1727204281.10206: in VariableManager get_vars() 26264 1727204281.10236: Calling all_inventory to load vars for managed-node3 26264 1727204281.10239: Calling groups_inventory to load vars for managed-node3 26264 1727204281.10243: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204281.10254: Calling all_plugins_play to load vars for managed-node3 26264 1727204281.10257: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204281.10259: Calling groups_plugins_play to load vars for managed-node3 26264 1727204281.14278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204281.18871: done with get_vars() 26264 1727204281.18909: done getting variables 26264 1727204281.19093: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) 26264 1727204281.19246: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.420) 0:00:45.042 ***** 26264 1727204281.19285: entering _queue_task() for managed-node3/assert 26264 1727204281.19636: worker is 1 (out of 1 available) 26264 1727204281.19648: exiting _queue_task() for managed-node3/assert 26264 1727204281.19661: done queuing things up, now waiting for results queue to drain 26264 1727204281.19663: waiting for pending results... 26264 1727204281.19980: running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'lsr27' 26264 1727204281.20110: in run() - task 0affcd87-79f5-5ff5-08b0-00000000053d 26264 1727204281.20135: variable 'ansible_search_path' from source: unknown 26264 1727204281.20143: variable 'ansible_search_path' from source: unknown 26264 1727204281.20185: calling self._execute() 26264 1727204281.20292: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204281.20304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204281.20321: variable 'omit' from source: magic vars 26264 1727204281.20712: variable 'ansible_distribution_major_version' from source: facts 26264 1727204281.20730: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204281.20742: variable 'omit' from source: magic vars 26264 1727204281.20795: variable 'omit' from source: magic vars 26264 1727204281.20937: variable 'interface' from source: set_fact 26264 1727204281.20960: variable 'omit' from source: magic vars 26264 1727204281.21044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204281.21089: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204281.21137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204281.21160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204281.21181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204281.21216: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204281.21272: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204281.21281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204281.21430: Set connection var ansible_pipelining to False 26264 1727204281.21439: Set connection var ansible_connection to ssh 26264 1727204281.21446: Set connection var ansible_shell_type to sh 26264 1727204281.21470: Set connection var ansible_shell_executable to /bin/sh 26264 1727204281.21490: Set connection var ansible_timeout to 10 26264 1727204281.21503: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204281.21532: variable 'ansible_shell_executable' from source: unknown 26264 1727204281.21541: variable 'ansible_connection' from source: unknown 26264 1727204281.21549: variable 'ansible_module_compression' from source: unknown 26264 1727204281.21556: variable 'ansible_shell_type' from source: unknown 26264 1727204281.21563: variable 'ansible_shell_executable' from source: unknown 26264 1727204281.21573: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204281.21581: variable 'ansible_pipelining' from source: unknown 26264 1727204281.21595: variable 'ansible_timeout' from source: unknown 26264 1727204281.21602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 
1727204281.21754: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204281.21779: variable 'omit' from source: magic vars 26264 1727204281.21791: starting attempt loop 26264 1727204281.21800: running the handler 26264 1727204281.21958: variable 'interface_stat' from source: set_fact 26264 1727204281.21976: Evaluated conditional (not interface_stat.stat.exists): True 26264 1727204281.21985: handler run complete 26264 1727204281.22005: attempt loop complete, returning result 26264 1727204281.22012: _execute() done 26264 1727204281.22020: dumping result to json 26264 1727204281.22033: done dumping result, returning 26264 1727204281.22045: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'lsr27' [0affcd87-79f5-5ff5-08b0-00000000053d] 26264 1727204281.22056: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000053d ok: [managed-node3] => { "changed": false } MSG: All assertions passed 26264 1727204281.22204: no more pending results, returning what we have 26264 1727204281.22209: results queue empty 26264 1727204281.22210: checking for any_errors_fatal 26264 1727204281.22219: done checking for any_errors_fatal 26264 1727204281.22220: checking for max_fail_percentage 26264 1727204281.22222: done checking for max_fail_percentage 26264 1727204281.22223: checking to see if all hosts have failed and the running result is not ok 26264 1727204281.22224: done checking to see if all hosts have failed 26264 1727204281.22225: getting the remaining hosts for this loop 26264 1727204281.22227: done getting the remaining hosts for this loop 26264 1727204281.22232: getting the next task for host managed-node3 26264 1727204281.22242: done getting next task for host managed-node3 
26264 1727204281.22245: ^ task is: TASK: meta (flush_handlers) 26264 1727204281.22248: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204281.22252: getting variables 26264 1727204281.22254: in VariableManager get_vars() 26264 1727204281.22288: Calling all_inventory to load vars for managed-node3 26264 1727204281.22291: Calling groups_inventory to load vars for managed-node3 26264 1727204281.22295: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204281.22307: Calling all_plugins_play to load vars for managed-node3 26264 1727204281.22311: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204281.22314: Calling groups_plugins_play to load vars for managed-node3 26264 1727204281.23738: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000053d 26264 1727204281.23741: WORKER PROCESS EXITING 26264 1727204281.24651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204281.26901: done with get_vars() 26264 1727204281.26930: done getting variables 26264 1727204281.27116: in VariableManager get_vars() 26264 1727204281.27128: Calling all_inventory to load vars for managed-node3 26264 1727204281.27130: Calling groups_inventory to load vars for managed-node3 26264 1727204281.27133: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204281.27138: Calling all_plugins_play to load vars for managed-node3 26264 1727204281.27140: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204281.27143: Calling groups_plugins_play to load vars for managed-node3 26264 1727204281.28762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 26264 1727204281.30834: done with get_vars() 26264 1727204281.30871: done queuing things up, now waiting for results queue to drain 26264 1727204281.30874: results queue empty 26264 1727204281.30875: checking for any_errors_fatal 26264 1727204281.30878: done checking for any_errors_fatal 26264 1727204281.30879: checking for max_fail_percentage 26264 1727204281.30880: done checking for max_fail_percentage 26264 1727204281.30880: checking to see if all hosts have failed and the running result is not ok 26264 1727204281.30881: done checking to see if all hosts have failed 26264 1727204281.30887: getting the remaining hosts for this loop 26264 1727204281.30888: done getting the remaining hosts for this loop 26264 1727204281.31070: getting the next task for host managed-node3 26264 1727204281.31075: done getting next task for host managed-node3 26264 1727204281.31077: ^ task is: TASK: meta (flush_handlers) 26264 1727204281.31079: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204281.31081: getting variables 26264 1727204281.31082: in VariableManager get_vars() 26264 1727204281.31094: Calling all_inventory to load vars for managed-node3 26264 1727204281.31096: Calling groups_inventory to load vars for managed-node3 26264 1727204281.31098: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204281.31104: Calling all_plugins_play to load vars for managed-node3 26264 1727204281.31106: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204281.31109: Calling groups_plugins_play to load vars for managed-node3 26264 1727204281.33182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204281.35052: done with get_vars() 26264 1727204281.35077: done getting variables 26264 1727204281.35143: in VariableManager get_vars() 26264 1727204281.35153: Calling all_inventory to load vars for managed-node3 26264 1727204281.35156: Calling groups_inventory to load vars for managed-node3 26264 1727204281.35158: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204281.35165: Calling all_plugins_play to load vars for managed-node3 26264 1727204281.35168: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204281.35171: Calling groups_plugins_play to load vars for managed-node3 26264 1727204281.36538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204281.38374: done with get_vars() 26264 1727204281.38404: done queuing things up, now waiting for results queue to drain 26264 1727204281.38406: results queue empty 26264 1727204281.38407: checking for any_errors_fatal 26264 1727204281.38409: done checking for any_errors_fatal 26264 1727204281.38409: checking for max_fail_percentage 26264 1727204281.38410: done checking for max_fail_percentage 26264 1727204281.38411: checking to see if all hosts have failed and the running result is not 
ok 26264 1727204281.38412: done checking to see if all hosts have failed 26264 1727204281.38413: getting the remaining hosts for this loop 26264 1727204281.38414: done getting the remaining hosts for this loop 26264 1727204281.38417: getting the next task for host managed-node3 26264 1727204281.38420: done getting next task for host managed-node3 26264 1727204281.38421: ^ task is: None 26264 1727204281.38422: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204281.38423: done queuing things up, now waiting for results queue to drain 26264 1727204281.38424: results queue empty 26264 1727204281.38425: checking for any_errors_fatal 26264 1727204281.38426: done checking for any_errors_fatal 26264 1727204281.38427: checking for max_fail_percentage 26264 1727204281.38428: done checking for max_fail_percentage 26264 1727204281.38429: checking to see if all hosts have failed and the running result is not ok 26264 1727204281.38430: done checking to see if all hosts have failed 26264 1727204281.38431: getting the next task for host managed-node3 26264 1727204281.38433: done getting next task for host managed-node3 26264 1727204281.38434: ^ task is: None 26264 1727204281.38435: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204281.38477: in VariableManager get_vars() 26264 1727204281.38493: done with get_vars() 26264 1727204281.38499: in VariableManager get_vars() 26264 1727204281.38509: done with get_vars() 26264 1727204281.38514: variable 'omit' from source: magic vars 26264 1727204281.38543: in VariableManager get_vars() 26264 1727204281.38552: done with get_vars() 26264 1727204281.38576: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 26264 1727204281.38846: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 26264 1727204281.38871: getting the remaining hosts for this loop 26264 1727204281.38873: done getting the remaining hosts for this loop 26264 1727204281.38875: getting the next task for host managed-node3 26264 1727204281.38878: done getting next task for host managed-node3 26264 1727204281.38880: ^ task is: TASK: Gathering Facts 26264 1727204281.38881: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204281.38883: getting variables 26264 1727204281.38884: in VariableManager get_vars() 26264 1727204281.38893: Calling all_inventory to load vars for managed-node3 26264 1727204281.38895: Calling groups_inventory to load vars for managed-node3 26264 1727204281.38897: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204281.38905: Calling all_plugins_play to load vars for managed-node3 26264 1727204281.38908: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204281.38911: Calling groups_plugins_play to load vars for managed-node3 26264 1727204281.40298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204281.42050: done with get_vars() 26264 1727204281.42075: done getting variables 26264 1727204281.42124: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.228) 0:00:45.270 ***** 26264 1727204281.42150: entering _queue_task() for managed-node3/gather_facts 26264 1727204281.42504: worker is 1 (out of 1 available) 26264 1727204281.42515: exiting _queue_task() for managed-node3/gather_facts 26264 1727204281.42537: done queuing things up, now waiting for results queue to drain 26264 1727204281.42539: waiting for pending results... 
26264 1727204281.42857: running TaskExecutor() for managed-node3/TASK: Gathering Facts 26264 1727204281.43036: in run() - task 0affcd87-79f5-5ff5-08b0-00000000056d 26264 1727204281.43066: variable 'ansible_search_path' from source: unknown 26264 1727204281.43124: calling self._execute() 26264 1727204281.43237: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204281.43249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204281.43267: variable 'omit' from source: magic vars 26264 1727204281.43726: variable 'ansible_distribution_major_version' from source: facts 26264 1727204281.43749: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204281.43766: variable 'omit' from source: magic vars 26264 1727204281.43799: variable 'omit' from source: magic vars 26264 1727204281.43841: variable 'omit' from source: magic vars 26264 1727204281.43900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204281.43945: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204281.43981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204281.44005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204281.44023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204281.44068: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204281.44084: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204281.44093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204281.44207: Set connection var ansible_pipelining to False 26264 1727204281.44216: Set 
connection var ansible_connection to ssh 26264 1727204281.44223: Set connection var ansible_shell_type to sh 26264 1727204281.44233: Set connection var ansible_shell_executable to /bin/sh 26264 1727204281.44245: Set connection var ansible_timeout to 10 26264 1727204281.44257: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204281.44293: variable 'ansible_shell_executable' from source: unknown 26264 1727204281.44306: variable 'ansible_connection' from source: unknown 26264 1727204281.44313: variable 'ansible_module_compression' from source: unknown 26264 1727204281.44321: variable 'ansible_shell_type' from source: unknown 26264 1727204281.44328: variable 'ansible_shell_executable' from source: unknown 26264 1727204281.44340: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204281.44349: variable 'ansible_pipelining' from source: unknown 26264 1727204281.44356: variable 'ansible_timeout' from source: unknown 26264 1727204281.44363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204281.44562: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204281.44584: variable 'omit' from source: magic vars 26264 1727204281.44594: starting attempt loop 26264 1727204281.44600: running the handler 26264 1727204281.44634: variable 'ansible_facts' from source: unknown 26264 1727204281.44658: _low_level_execute_command(): starting 26264 1727204281.44672: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204281.46717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
26264 1727204281.46722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204281.46773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 26264 1727204281.46777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204281.46789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.47044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204281.48289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204281.48293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204281.48351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204281.49991: stdout chunk (state=3): >>>/root <<< 26264 1727204281.50113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204281.50185: stderr chunk (state=3): >>><<< 26264 1727204281.50188: stdout chunk (state=3): >>><<< 26264 1727204281.50278: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204281.50282: _low_level_execute_command(): starting 26264 1727204281.50285: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236 `" && echo ansible-tmp-1727204281.5021143-29716-58978004591236="` echo /root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236 `" ) && sleep 0' 26264 1727204281.51936: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204281.51940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204281.51969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.51981: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204281.51983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204281.52000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.52046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204281.52050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204281.52052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204281.52112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204281.53978: stdout chunk (state=3): >>>ansible-tmp-1727204281.5021143-29716-58978004591236=/root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236 <<< 26264 1727204281.54175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204281.54179: stdout chunk (state=3): >>><<< 26264 1727204281.54181: stderr chunk (state=3): >>><<< 26264 1727204281.54206: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204281.5021143-29716-58978004591236=/root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204281.54570: variable 'ansible_module_compression' from source: unknown 26264 1727204281.54573: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26264 1727204281.54576: variable 'ansible_facts' from source: unknown 26264 1727204281.54578: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236/AnsiballZ_setup.py 26264 1727204281.54868: Sending initial data 26264 1727204281.54879: Sent initial data (153 bytes) 26264 1727204281.56325: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204281.56329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204281.56361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.56367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204281.56370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 26264 1727204281.56372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.56509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204281.56605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204281.56638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204281.58389: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204281.58435: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204281.58458: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmp69nmm8tz /root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236/AnsiballZ_setup.py <<< 26264 1727204281.58476: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204281.61749: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204281.61754: stderr chunk (state=3): >>><<< 26264 1727204281.61756: stdout chunk (state=3): >>><<< 26264 1727204281.61758: done transferring module to remote 26264 1727204281.61760: _low_level_execute_command(): starting 26264 1727204281.61763: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236/ /root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236/AnsiballZ_setup.py && sleep 0' 26264 1727204281.64284: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204281.64488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204281.64506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204281.64525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204281.64580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204281.64592: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204281.64605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.64624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204281.64635: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204281.64644: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204281.64659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204281.64674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204281.64689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
26264 1727204281.64713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204281.64725: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204281.64737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.65024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204281.65055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204281.65058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204281.65286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204281.66988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204281.67044: stderr chunk (state=3): >>><<< 26264 1727204281.67047: stdout chunk (state=3): >>><<< 26264 1727204281.67143: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204281.67146: _low_level_execute_command(): starting 26264 1727204281.67149: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236/AnsiballZ_setup.py && sleep 0' 26264 1727204281.68743: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204281.68759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204281.68778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204281.68795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204281.68841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204281.68852: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204281.68868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.68886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204281.68897: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204281.68906: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204281.68917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204281.68929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204281.68943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204281.68953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 
10.31.15.87 <<< 26264 1727204281.68966: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204281.68980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204281.69057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204281.69184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204281.69198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204281.69437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204282.19117: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJ<<< 26264 1727204282.19170: stdout chunk (state=3): >>>nwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", 
"ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2833, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 699, "free": 2833}, "nocache": {"free": 3293, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 627, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264280043520, "block_size": 4096, "block_total": 65519355, "block_available": 64521495, "block_used": 997860, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "02", "epoch": "1727204282", "epoch_int": "1727204282", "date": "2024-09-24", "time": "14:58:02", "iso8601_micro": "2024-09-24T18:58:02.151455Z", "iso8601": 
"2024-09-24T18:58:02Z", "iso8601_basic": "20240924T145802151455", "iso8601_basic_short": "20240924T145802", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.48, "5m": 0.39, "15m": 0.2}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": 
"off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26264 1727204282.20791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204282.20871: stderr chunk (state=3): >>><<< 26264 1727204282.20876: stdout chunk (state=3): >>><<< 26264 1727204282.21175: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": 
"/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2833, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 699, "free": 2833}, "nocache": {"free": 3293, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, 
"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 627, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264280043520, "block_size": 4096, "block_total": 65519355, "block_available": 64521495, "block_used": 997860, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "02", "epoch": "1727204282", "epoch_int": "1727204282", "date": "2024-09-24", "time": "14:58:02", "iso8601_micro": "2024-09-24T18:58:02.151455Z", "iso8601": "2024-09-24T18:58:02Z", "iso8601_basic": "20240924T145802151455", "iso8601_basic_short": "20240924T145802", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": 
"enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.48, "5m": 0.39, "15m": 0.2}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", 
"prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204282.21528: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204282.21559: _low_level_execute_command(): starting 26264 1727204282.21570: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204281.5021143-29716-58978004591236/ > /dev/null 2>&1 && sleep 0' 26264 1727204282.22401: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204282.22416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.22431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.22450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.22502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.22513: stderr chunk (state=3): >>>debug2: match 
not found <<< 26264 1727204282.22525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.22541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204282.22554: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204282.22566: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204282.22578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.22595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.22610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.22620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.22629: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204282.22641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.22727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204282.22751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204282.22768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204282.22843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204282.24693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204282.24727: stderr chunk (state=3): >>><<< 26264 1727204282.24731: stdout chunk (state=3): >>><<< 26264 1727204282.24966: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204282.24970: handler run complete 26264 1727204282.24972: variable 'ansible_facts' from source: unknown 26264 1727204282.25015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204282.25361: variable 'ansible_facts' from source: unknown 26264 1727204282.25567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204282.25734: attempt loop complete, returning result 26264 1727204282.25744: _execute() done 26264 1727204282.25754: dumping result to json 26264 1727204282.25796: done dumping result, returning 26264 1727204282.25809: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-5ff5-08b0-00000000056d] 26264 1727204282.25819: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000056d ok: [managed-node3] 26264 1727204282.26544: no more pending results, returning what we have 26264 1727204282.26551: results queue empty 26264 
1727204282.26584: checking for any_errors_fatal 26264 1727204282.26586: done checking for any_errors_fatal 26264 1727204282.26587: checking for max_fail_percentage 26264 1727204282.26589: done checking for max_fail_percentage 26264 1727204282.26590: checking to see if all hosts have failed and the running result is not ok 26264 1727204282.26592: done checking to see if all hosts have failed 26264 1727204282.26593: getting the remaining hosts for this loop 26264 1727204282.26595: done getting the remaining hosts for this loop 26264 1727204282.26600: getting the next task for host managed-node3 26264 1727204282.26606: done getting next task for host managed-node3 26264 1727204282.26608: ^ task is: TASK: meta (flush_handlers) 26264 1727204282.26610: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204282.26615: getting variables 26264 1727204282.26616: in VariableManager get_vars() 26264 1727204282.26645: Calling all_inventory to load vars for managed-node3 26264 1727204282.26651: Calling groups_inventory to load vars for managed-node3 26264 1727204282.26656: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204282.26670: Calling all_plugins_play to load vars for managed-node3 26264 1727204282.26673: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204282.26677: Calling groups_plugins_play to load vars for managed-node3 26264 1727204282.28430: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000056d 26264 1727204282.28435: WORKER PROCESS EXITING 26264 1727204282.30636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204282.37738: done with get_vars() 26264 1727204282.37767: done getting variables 26264 1727204282.37828: in VariableManager get_vars() 26264 1727204282.37837: Calling all_inventory to load vars for managed-node3 26264 1727204282.37840: Calling groups_inventory to load vars for managed-node3 26264 1727204282.37842: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204282.37847: Calling all_plugins_play to load vars for managed-node3 26264 1727204282.37852: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204282.37860: Calling groups_plugins_play to load vars for managed-node3 26264 1727204282.39140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204282.40981: done with get_vars() 26264 1727204282.41015: done queuing things up, now waiting for results queue to drain 26264 1727204282.41018: results queue empty 26264 1727204282.41019: checking for any_errors_fatal 26264 1727204282.41023: done checking for any_errors_fatal 26264 1727204282.41024: checking for max_fail_percentage 26264 
1727204282.41025: done checking for max_fail_percentage 26264 1727204282.41026: checking to see if all hosts have failed and the running result is not ok 26264 1727204282.41027: done checking to see if all hosts have failed 26264 1727204282.41028: getting the remaining hosts for this loop 26264 1727204282.41029: done getting the remaining hosts for this loop 26264 1727204282.41032: getting the next task for host managed-node3 26264 1727204282.41036: done getting next task for host managed-node3 26264 1727204282.41038: ^ task is: TASK: Verify network state restored to default 26264 1727204282.41039: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204282.41042: getting variables 26264 1727204282.41043: in VariableManager get_vars() 26264 1727204282.41054: Calling all_inventory to load vars for managed-node3 26264 1727204282.41057: Calling groups_inventory to load vars for managed-node3 26264 1727204282.41059: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204282.41067: Calling all_plugins_play to load vars for managed-node3 26264 1727204282.41070: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204282.41073: Calling groups_plugins_play to load vars for managed-node3 26264 1727204282.43384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204282.45321: done with get_vars() 26264 1727204282.45347: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Tuesday 24 September 2024 14:58:02 -0400 (0:00:01.032) 0:00:46.303 ***** 26264 
1727204282.45428: entering _queue_task() for managed-node3/include_tasks 26264 1727204282.45776: worker is 1 (out of 1 available) 26264 1727204282.45787: exiting _queue_task() for managed-node3/include_tasks 26264 1727204282.45800: done queuing things up, now waiting for results queue to drain 26264 1727204282.45802: waiting for pending results... 26264 1727204282.46096: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 26264 1727204282.46224: in run() - task 0affcd87-79f5-5ff5-08b0-000000000078 26264 1727204282.46245: variable 'ansible_search_path' from source: unknown 26264 1727204282.46292: calling self._execute() 26264 1727204282.46408: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204282.46419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204282.46434: variable 'omit' from source: magic vars 26264 1727204282.46859: variable 'ansible_distribution_major_version' from source: facts 26264 1727204282.46882: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204282.46894: _execute() done 26264 1727204282.46904: dumping result to json 26264 1727204282.46912: done dumping result, returning 26264 1727204282.46921: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [0affcd87-79f5-5ff5-08b0-000000000078] 26264 1727204282.46932: sending task result for task 0affcd87-79f5-5ff5-08b0-000000000078 26264 1727204282.47074: no more pending results, returning what we have 26264 1727204282.47079: in VariableManager get_vars() 26264 1727204282.47114: Calling all_inventory to load vars for managed-node3 26264 1727204282.47117: Calling groups_inventory to load vars for managed-node3 26264 1727204282.47121: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204282.47137: Calling all_plugins_play to load vars for managed-node3 26264 1727204282.47140: Calling groups_plugins_inventory to 
load vars for managed-node3 26264 1727204282.47144: Calling groups_plugins_play to load vars for managed-node3 26264 1727204282.48408: done sending task result for task 0affcd87-79f5-5ff5-08b0-000000000078 26264 1727204282.48411: WORKER PROCESS EXITING 26264 1727204282.48967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204282.50662: done with get_vars() 26264 1727204282.50689: variable 'ansible_search_path' from source: unknown 26264 1727204282.50706: we have included files to process 26264 1727204282.50707: generating all_blocks data 26264 1727204282.50708: done generating all_blocks data 26264 1727204282.50709: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 26264 1727204282.50710: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 26264 1727204282.50712: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 26264 1727204282.51127: done processing included file 26264 1727204282.51130: iterating over new_blocks loaded from include file 26264 1727204282.51131: in VariableManager get_vars() 26264 1727204282.51143: done with get_vars() 26264 1727204282.51145: filtering new block on tags 26264 1727204282.51166: done filtering new block on tags 26264 1727204282.51169: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 26264 1727204282.51175: extending task lists for all hosts with included blocks 26264 1727204282.51206: done extending task lists 26264 1727204282.51207: done processing included files 26264 1727204282.51208: results queue empty 26264 1727204282.51208: checking for 
any_errors_fatal 26264 1727204282.51210: done checking for any_errors_fatal 26264 1727204282.51211: checking for max_fail_percentage 26264 1727204282.51212: done checking for max_fail_percentage 26264 1727204282.51213: checking to see if all hosts have failed and the running result is not ok 26264 1727204282.51213: done checking to see if all hosts have failed 26264 1727204282.51214: getting the remaining hosts for this loop 26264 1727204282.51215: done getting the remaining hosts for this loop 26264 1727204282.51218: getting the next task for host managed-node3 26264 1727204282.51221: done getting next task for host managed-node3 26264 1727204282.51223: ^ task is: TASK: Check routes and DNS 26264 1727204282.51225: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204282.51228: getting variables 26264 1727204282.51229: in VariableManager get_vars() 26264 1727204282.51237: Calling all_inventory to load vars for managed-node3 26264 1727204282.51239: Calling groups_inventory to load vars for managed-node3 26264 1727204282.51241: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204282.51246: Calling all_plugins_play to load vars for managed-node3 26264 1727204282.51251: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204282.51254: Calling groups_plugins_play to load vars for managed-node3 26264 1727204282.52603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204282.54324: done with get_vars() 26264 1727204282.54355: done getting variables 26264 1727204282.54402: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.090) 0:00:46.393 ***** 26264 1727204282.54434: entering _queue_task() for managed-node3/shell 26264 1727204282.54783: worker is 1 (out of 1 available) 26264 1727204282.54796: exiting _queue_task() for managed-node3/shell 26264 1727204282.54808: done queuing things up, now waiting for results queue to drain 26264 1727204282.54810: waiting for pending results... 
26264 1727204282.55101: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 26264 1727204282.55220: in run() - task 0affcd87-79f5-5ff5-08b0-00000000057e 26264 1727204282.55240: variable 'ansible_search_path' from source: unknown 26264 1727204282.55251: variable 'ansible_search_path' from source: unknown 26264 1727204282.55294: calling self._execute() 26264 1727204282.55401: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204282.55411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204282.55423: variable 'omit' from source: magic vars 26264 1727204282.55818: variable 'ansible_distribution_major_version' from source: facts 26264 1727204282.55835: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204282.55845: variable 'omit' from source: magic vars 26264 1727204282.55898: variable 'omit' from source: magic vars 26264 1727204282.55940: variable 'omit' from source: magic vars 26264 1727204282.55989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204282.56038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204282.56069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204282.56091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204282.56106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204282.56143: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204282.56154: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204282.56162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204282.56279: 
Set connection var ansible_pipelining to False 26264 1727204282.56288: Set connection var ansible_connection to ssh 26264 1727204282.56293: Set connection var ansible_shell_type to sh 26264 1727204282.56303: Set connection var ansible_shell_executable to /bin/sh 26264 1727204282.56313: Set connection var ansible_timeout to 10 26264 1727204282.56324: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204282.56358: variable 'ansible_shell_executable' from source: unknown 26264 1727204282.56368: variable 'ansible_connection' from source: unknown 26264 1727204282.56375: variable 'ansible_module_compression' from source: unknown 26264 1727204282.56381: variable 'ansible_shell_type' from source: unknown 26264 1727204282.56387: variable 'ansible_shell_executable' from source: unknown 26264 1727204282.56393: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204282.56399: variable 'ansible_pipelining' from source: unknown 26264 1727204282.56405: variable 'ansible_timeout' from source: unknown 26264 1727204282.56411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204282.56561: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204282.56579: variable 'omit' from source: magic vars 26264 1727204282.56590: starting attempt loop 26264 1727204282.56596: running the handler 26264 1727204282.56608: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204282.56629: 
_low_level_execute_command(): starting 26264 1727204282.56640: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204282.57433: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204282.57447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.57471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.57490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.57533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.57553: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204282.57570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.57587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204282.57598: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204282.57607: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204282.57618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.57632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.57652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.57666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.57677: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204282.57689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.57774: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 26264 1727204282.57791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204282.57806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204282.57991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204282.59570: stdout chunk (state=3): >>>/root <<< 26264 1727204282.59775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204282.59778: stdout chunk (state=3): >>><<< 26264 1727204282.59781: stderr chunk (state=3): >>><<< 26264 1727204282.59872: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204282.59877: _low_level_execute_command(): starting 26264 1727204282.59888: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695 `" && echo ansible-tmp-1727204282.5980241-29917-19880795767695="` echo /root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695 `" ) && sleep 0' 26264 1727204282.61458: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204282.61585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.61594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.61609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.61654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.61661: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204282.61674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.61688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204282.61695: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204282.61702: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204282.61710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.61721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.61731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.61746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.61778: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204282.61788: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.61863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204282.61995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204282.62008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204282.62083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204282.63925: stdout chunk (state=3): >>>ansible-tmp-1727204282.5980241-29917-19880795767695=/root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695 <<< 26264 1727204282.64106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204282.64110: stdout chunk (state=3): >>><<< 26264 1727204282.64115: stderr chunk (state=3): >>><<< 26264 1727204282.64271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204282.5980241-29917-19880795767695=/root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204282.64274: variable 'ansible_module_compression' from source: unknown 26264 1727204282.64276: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26264 1727204282.64278: variable 'ansible_facts' from source: unknown 26264 1727204282.64336: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695/AnsiballZ_command.py 26264 1727204282.64940: Sending initial data 26264 1727204282.64943: Sent initial data (155 bytes) 26264 1727204282.69054: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204282.69083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.69100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.69287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.69334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.69346: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204282.69360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.69386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204282.69400: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204282.69412: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204282.69427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.69439: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.69454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.69469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.69480: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204282.69492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.69577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204282.69599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204282.69614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204282.69687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204282.71407: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 26264 1727204282.71432: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204282.71460: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204282.71463: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpgevisr24 
/root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695/AnsiballZ_command.py <<< 26264 1727204282.71502: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204282.72762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204282.72942: stderr chunk (state=3): >>><<< 26264 1727204282.72946: stdout chunk (state=3): >>><<< 26264 1727204282.72951: done transferring module to remote 26264 1727204282.72954: _low_level_execute_command(): starting 26264 1727204282.72956: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695/ /root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695/AnsiballZ_command.py && sleep 0' 26264 1727204282.74390: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204282.74413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.74431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.74451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.74507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.74524: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204282.74539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.74558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204282.74574: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204282.74585: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204282.74597: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 26264 1727204282.74610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.74633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.74646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.74658: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204282.74676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.74784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204282.74802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204282.74817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204282.74944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204282.76707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204282.76711: stdout chunk (state=3): >>><<< 26264 1727204282.76713: stderr chunk (state=3): >>><<< 26264 1727204282.76820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204282.76824: _low_level_execute_command(): starting 26264 1727204282.76828: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695/AnsiballZ_command.py && sleep 0' 26264 1727204282.78359: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.78369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.78379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.78387: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204282.78398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.78412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204282.78419: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204282.78427: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204282.78436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.78445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.78457: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.78466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.78478: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204282.78487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.78561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204282.79291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204282.79303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204282.79386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204282.93286: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2985sec preferred_lft 2985sec\n inet6 fe80::8ff:f5ff:fed7:be93/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho 
IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:58:02.923390", "end": "2024-09-24 14:58:02.931484", "delta": "0:00:00.008094", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26264 1727204282.94509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 26264 1727204282.94513: stdout chunk (state=3): >>><<< 26264 1727204282.94515: stderr chunk (state=3): >>><<< 26264 1727204282.94672: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2985sec preferred_lft 2985sec\n inet6 fe80::8ff:f5ff:fed7:be93/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref 
medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:58:02.923390", "end": "2024-09-24 14:58:02.931484", "delta": "0:00:00.008094", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204282.94677: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204282.94680: _low_level_execute_command(): starting 26264 1727204282.94682: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204282.5980241-29917-19880795767695/ > /dev/null 2>&1 && sleep 0' 26264 1727204282.96044: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204282.96287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.96306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.96323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.96371: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.96385: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204282.96399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.96416: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204282.96428: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204282.96438: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204282.96448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204282.96460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204282.96478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204282.96490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204282.96501: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204282.96514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204282.96700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204282.96724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204282.96742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204282.96819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204282.98654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204282.98658: stdout chunk (state=3): >>><<< 26264 1727204282.98661: stderr chunk (state=3): >>><<< 26264 1727204282.99125: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204282.99133: handler run complete 26264 1727204282.99136: Evaluated conditional (False): False 26264 1727204282.99138: attempt loop complete, returning result 26264 1727204282.99140: _execute() done 26264 1727204282.99142: dumping result to json 26264 1727204282.99144: done dumping result, returning 26264 1727204282.99146: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [0affcd87-79f5-5ff5-08b0-00000000057e] 26264 1727204282.99151: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000057e 26264 1727204282.99230: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000057e 26264 1727204282.99234: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO 
/etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008094", "end": "2024-09-24 14:58:02.931484", "rc": 0, "start": "2024-09-24 14:58:02.923390" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 2985sec preferred_lft 2985sec inet6 fe80::8ff:f5ff:fed7:be93/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 26264 1727204282.99307: no more pending results, returning what we have 26264 1727204282.99310: results queue empty 26264 1727204282.99311: checking for any_errors_fatal 26264 1727204282.99312: done checking for any_errors_fatal 26264 1727204282.99313: checking for max_fail_percentage 26264 1727204282.99315: done checking for max_fail_percentage 26264 1727204282.99315: checking to see if all hosts have failed and the running result is not ok 26264 1727204282.99317: done checking to see if all hosts have failed 26264 1727204282.99317: getting the remaining hosts for this loop 26264 1727204282.99319: done getting the remaining hosts for this loop 26264 1727204282.99323: getting the next task for host managed-node3 26264 1727204282.99328: done getting next task for host managed-node3 26264 1727204282.99331: ^ task is: 
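For readability, the diagnostic script that the "Check routes and DNS" task ran on the remote host, recovered verbatim from the `cmd` / `_raw_params` fields in the result above (it prints interface addresses, IPv4/IPv6 routes, and the resolver configuration), is:

```shell
# Recovered from the task result above; requires iproute2's `ip` on the target.
set -euo pipefail
echo IP
ip a
echo IP ROUTE
ip route
echo IP -6 ROUTE
ip -6 route
echo RESOLV
if [ -f /etc/resolv.conf ]; then
  cat /etc/resolv.conf
else
  echo NO /etc/resolv.conf
  ls -alrtF /etc/resolv.* || :
fi
```

With `set -euo pipefail`, any failing command (for example a missing `ip` binary) aborts the script with a nonzero `rc`, which is why the log's `rc: 0` confirms every probe succeeded.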
TASK: Verify DNS and network connectivity 26264 1727204282.99334: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204282.99338: getting variables 26264 1727204282.99339: in VariableManager get_vars() 26264 1727204282.99372: Calling all_inventory to load vars for managed-node3 26264 1727204282.99375: Calling groups_inventory to load vars for managed-node3 26264 1727204282.99378: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204282.99389: Calling all_plugins_play to load vars for managed-node3 26264 1727204282.99391: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204282.99394: Calling groups_plugins_play to load vars for managed-node3 26264 1727204283.01625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204283.03611: done with get_vars() 26264 1727204283.03643: done getting variables 26264 1727204283.03728: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 
Tuesday 24 September 2024 14:58:03 -0400 (0:00:00.493) 0:00:46.886 ***** 26264 1727204283.03768: entering _queue_task() for managed-node3/shell 26264 1727204283.04624: worker is 1 (out of 1 available) 26264 1727204283.04636: exiting _queue_task() for managed-node3/shell 26264 1727204283.04653: done queuing things up, now waiting for results queue to drain 26264 1727204283.04655: waiting for pending results... 26264 1727204283.05643: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 26264 1727204283.06037: in run() - task 0affcd87-79f5-5ff5-08b0-00000000057f 26264 1727204283.06123: variable 'ansible_search_path' from source: unknown 26264 1727204283.06128: variable 'ansible_search_path' from source: unknown 26264 1727204283.06166: calling self._execute() 26264 1727204283.06293: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204283.06296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204283.06308: variable 'omit' from source: magic vars 26264 1727204283.06758: variable 'ansible_distribution_major_version' from source: facts 26264 1727204283.06783: Evaluated conditional (ansible_distribution_major_version != '6'): True 26264 1727204283.06941: variable 'ansible_facts' from source: unknown 26264 1727204283.07792: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 26264 1727204283.07805: variable 'omit' from source: magic vars 26264 1727204283.07868: variable 'omit' from source: magic vars 26264 1727204283.07919: variable 'omit' from source: magic vars 26264 1727204283.08052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26264 1727204283.08134: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26264 1727204283.08163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26264 1727204283.08193: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204283.08210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26264 1727204283.08249: variable 'inventory_hostname' from source: host vars for 'managed-node3' 26264 1727204283.08265: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204283.08274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204283.08394: Set connection var ansible_pipelining to False 26264 1727204283.08405: Set connection var ansible_connection to ssh 26264 1727204283.08411: Set connection var ansible_shell_type to sh 26264 1727204283.08420: Set connection var ansible_shell_executable to /bin/sh 26264 1727204283.08433: Set connection var ansible_timeout to 10 26264 1727204283.08444: Set connection var ansible_module_compression to ZIP_DEFLATED 26264 1727204283.08487: variable 'ansible_shell_executable' from source: unknown 26264 1727204283.08496: variable 'ansible_connection' from source: unknown 26264 1727204283.08504: variable 'ansible_module_compression' from source: unknown 26264 1727204283.08513: variable 'ansible_shell_type' from source: unknown 26264 1727204283.08519: variable 'ansible_shell_executable' from source: unknown 26264 1727204283.08524: variable 'ansible_host' from source: host vars for 'managed-node3' 26264 1727204283.08531: variable 'ansible_pipelining' from source: unknown 26264 1727204283.08537: variable 'ansible_timeout' from source: unknown 26264 1727204283.08546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 26264 1727204283.08713: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204283.08734: variable 'omit' from source: magic vars 26264 1727204283.08744: starting attempt loop 26264 1727204283.08753: running the handler 26264 1727204283.08768: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 26264 1727204283.08801: _low_level_execute_command(): starting 26264 1727204283.08818: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26264 1727204283.09677: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204283.09971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.10020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204283.10035: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204283.10052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.10082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204283.10111: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204283.10129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204283.10150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204283.10175: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 26264 1727204283.10203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.10232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204283.10252: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204283.10283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.10375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204283.10411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204283.10437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204283.10532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204283.12149: stdout chunk (state=3): >>>/root <<< 26264 1727204283.12257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204283.12362: stderr chunk (state=3): >>><<< 26264 1727204283.12384: stdout chunk (state=3): >>><<< 26264 1727204283.12531: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204283.12543: _low_level_execute_command(): starting 26264 1727204283.12547: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796 `" && echo ansible-tmp-1727204283.1242197-29944-152990570849796="` echo /root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796 `" ) && sleep 0' 26264 1727204283.13373: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204283.13399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204283.13416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204283.13434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.13479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204283.13498: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204283.13517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.13534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204283.13546: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204283.13556: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204283.13571: stderr chunk (state=3): >>>debug1: 
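The `mkdir` one-liner above is how Ansible provisions a per-task remote tmp directory before transferring `AnsiballZ_command.py`. A minimal sketch of the same pattern, adapted from the log (the actual run used `/root` as the remote home and a timestamp-PID-random suffix; `$HOME` and the `-demo` suffix here are stand-ins):

```shell
# Create a mode-0700 per-task tmp dir and echo name=path back to the controller,
# mirroring the command Ansible issues in the log above.
tmpname="ansible-tmp-$(date +%s.%N)-$$-demo"
( umask 77 && mkdir -p "$HOME/.ansible/tmp" \
  && mkdir "$HOME/.ansible/tmp/$tmpname" \
  && echo "$tmpname=$HOME/.ansible/tmp/$tmpname" )
```

The `umask 77` ensures the directory is readable only by the connecting user; the echoed `name=path` line is what `_low_level_execute_command()` parses out of stdout (visible at the `ansible-tmp-…=` line that follows).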
Reading configuration data /root/.ssh/config <<< 26264 1727204283.13585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204283.13607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.13623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204283.13636: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204283.13650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.13738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204283.13762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204283.13781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204283.13860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204283.15693: stdout chunk (state=3): >>>ansible-tmp-1727204283.1242197-29944-152990570849796=/root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796 <<< 26264 1727204283.15797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204283.15905: stderr chunk (state=3): >>><<< 26264 1727204283.15920: stdout chunk (state=3): >>><<< 26264 1727204283.16070: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204283.1242197-29944-152990570849796=/root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204283.16074: variable 'ansible_module_compression' from source: unknown 26264 1727204283.16076: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26264m9ad_7b5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26264 1727204283.16170: variable 'ansible_facts' from source: unknown 26264 1727204283.16203: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796/AnsiballZ_command.py 26264 1727204283.16374: Sending initial data 26264 1727204283.16377: Sent initial data (156 bytes) 26264 1727204283.17458: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204283.17480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204283.17504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204283.17526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.17579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204283.17593: stderr chunk (state=3): >>>debug2: match not found <<< 26264 
1727204283.17614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.17636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204283.17651: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204283.17663: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204283.17678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204283.17691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204283.17708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.17728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204283.17743: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204283.17762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.17853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204283.17881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204283.17898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204283.17981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204283.19694: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26264 1727204283.19762: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26264 1727204283.19794: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26264m9ad_7b5/tmpqxe2rwxn /root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796/AnsiballZ_command.py <<< 26264 1727204283.19803: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26264 1727204283.20881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204283.21171: stderr chunk (state=3): >>><<< 26264 1727204283.21175: stdout chunk (state=3): >>><<< 26264 1727204283.21178: done transferring module to remote 26264 1727204283.21180: _low_level_execute_command(): starting 26264 1727204283.21183: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796/ /root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796/AnsiballZ_command.py && sleep 0' 26264 1727204283.21893: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204283.21909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204283.21924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204283.21942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.22001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204283.22015: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204283.22031: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.22052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204283.22076: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204283.22087: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204283.22097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204283.22110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204283.22124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.22134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204283.22144: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204283.22159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.22244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204283.22272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204283.22298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204283.22373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204283.24167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204283.24171: stdout chunk (state=3): >>><<< 26264 1727204283.24174: stderr chunk (state=3): >>><<< 26264 1727204283.24277: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204283.24281: _low_level_execute_command(): starting 26264 1727204283.24286: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796/AnsiballZ_command.py && sleep 0' 26264 1727204283.24931: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204283.24973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204283.24988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204283.25020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.25110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204283.25121: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204283.25135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.25179: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204283.25204: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.25280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204283.25290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204283.25355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204283.82682: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1418 0 --:--:-- --:--:-- --:--:-- 1418\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1426 0 --:--:-- --:--:-- --:--:-- 1426", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:58:03.381918", "end": "2024-09-24 14:58:03.825416", "delta": "0:00:00.443498", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26264 1727204283.84058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 26264 1727204283.84062: stderr chunk (state=3): >>><<< 26264 1727204283.84068: stdout chunk (state=3): >>><<< 26264 1727204283.84097: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1418 0 --:--:-- --:--:-- --:--:-- 1418\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1426 0 --:--:-- --:--:-- --:--:-- 1426", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:58:03.381918", "end": "2024-09-24 14:58:03.825416", "delta": "0:00:00.443498", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 26264 1727204283.84142: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26264 1727204283.84154: _low_level_execute_command(): starting 26264 1727204283.84162: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204283.1242197-29944-152990570849796/ > /dev/null 2>&1 && sleep 0' 26264 1727204283.84882: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26264 1727204283.84892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204283.84909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204283.84922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.84967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 <<< 26264 1727204283.84971: stderr chunk (state=3): >>>debug2: match not found <<< 26264 1727204283.84982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.84996: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26264 1727204283.85005: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 26264 1727204283.85018: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26264 1727204283.85027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26264 1727204283.85035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26264 1727204283.85046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26264 1727204283.85056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 26264 1727204283.85063: stderr chunk (state=3): >>>debug2: match found <<< 26264 1727204283.85075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26264 1727204283.85155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26264 1727204283.85171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26264 1727204283.85177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26264 1727204283.85288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26264 1727204283.87017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26264 1727204283.87110: stderr chunk (state=3): >>><<< 26264 1727204283.87113: stdout chunk (state=3): >>><<< 26264 1727204283.87375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26264 1727204283.87378: handler run complete 26264 1727204283.87381: Evaluated conditional (False): False 26264 1727204283.87383: attempt loop complete, returning result 26264 1727204283.87385: _execute() done 26264 1727204283.87387: dumping result to json 26264 1727204283.87388: done dumping result, returning 26264 1727204283.87390: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [0affcd87-79f5-5ff5-08b0-00000000057f] 26264 1727204283.87392: sending task result for task 0affcd87-79f5-5ff5-08b0-00000000057f 26264 1727204283.87468: done sending task result for task 0affcd87-79f5-5ff5-08b0-00000000057f 26264 1727204283.87472: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.443498", "end": "2024-09-24 14:58:03.825416", "rc": 0, "start": "2024-09-24 14:58:03.381918" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1418 0 --:--:-- --:--:-- --:--:-- 1418 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1426 0 --:--:-- --:--:-- --:--:-- 1426 26264 1727204283.87556: no more pending results, 
returning what we have 26264 1727204283.87561: results queue empty 26264 1727204283.87561: checking for any_errors_fatal 26264 1727204283.87573: done checking for any_errors_fatal 26264 1727204283.87573: checking for max_fail_percentage 26264 1727204283.87575: done checking for max_fail_percentage 26264 1727204283.87576: checking to see if all hosts have failed and the running result is not ok 26264 1727204283.87578: done checking to see if all hosts have failed 26264 1727204283.87578: getting the remaining hosts for this loop 26264 1727204283.87580: done getting the remaining hosts for this loop 26264 1727204283.87586: getting the next task for host managed-node3 26264 1727204283.87594: done getting next task for host managed-node3 26264 1727204283.87596: ^ task is: TASK: meta (flush_handlers) 26264 1727204283.87599: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204283.87603: getting variables 26264 1727204283.87605: in VariableManager get_vars() 26264 1727204283.87637: Calling all_inventory to load vars for managed-node3 26264 1727204283.87640: Calling groups_inventory to load vars for managed-node3 26264 1727204283.87644: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204283.87657: Calling all_plugins_play to load vars for managed-node3 26264 1727204283.87660: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204283.87663: Calling groups_plugins_play to load vars for managed-node3 26264 1727204283.88946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204283.89915: done with get_vars() 26264 1727204283.89937: done getting variables 26264 1727204283.89997: in VariableManager get_vars() 26264 1727204283.90005: Calling all_inventory to load vars for managed-node3 26264 1727204283.90006: Calling groups_inventory to load vars for managed-node3 26264 1727204283.90008: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204283.90011: Calling all_plugins_play to load vars for managed-node3 26264 1727204283.90013: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204283.90014: Calling groups_plugins_play to load vars for managed-node3 26264 1727204283.91242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204283.92700: done with get_vars() 26264 1727204283.92733: done queuing things up, now waiting for results queue to drain 26264 1727204283.92735: results queue empty 26264 1727204283.92735: checking for any_errors_fatal 26264 1727204283.92738: done checking for any_errors_fatal 26264 1727204283.92739: checking for max_fail_percentage 26264 1727204283.92739: done checking for max_fail_percentage 26264 1727204283.92740: checking to see if all hosts have failed and the running result is not 
ok 26264 1727204283.92740: done checking to see if all hosts have failed 26264 1727204283.92741: getting the remaining hosts for this loop 26264 1727204283.92742: done getting the remaining hosts for this loop 26264 1727204283.92744: getting the next task for host managed-node3 26264 1727204283.92747: done getting next task for host managed-node3 26264 1727204283.92750: ^ task is: TASK: meta (flush_handlers) 26264 1727204283.92751: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26264 1727204283.92757: getting variables 26264 1727204283.92758: in VariableManager get_vars() 26264 1727204283.92768: Calling all_inventory to load vars for managed-node3 26264 1727204283.92769: Calling groups_inventory to load vars for managed-node3 26264 1727204283.92771: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204283.92776: Calling all_plugins_play to load vars for managed-node3 26264 1727204283.92777: Calling groups_plugins_inventory to load vars for managed-node3 26264 1727204283.92779: Calling groups_plugins_play to load vars for managed-node3 26264 1727204283.93513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204283.94809: done with get_vars() 26264 1727204283.94841: done getting variables 26264 1727204283.94904: in VariableManager get_vars() 26264 1727204283.94916: Calling all_inventory to load vars for managed-node3 26264 1727204283.94918: Calling groups_inventory to load vars for managed-node3 26264 1727204283.94924: Calling all_plugins_inventory to load vars for managed-node3 26264 1727204283.94929: Calling all_plugins_play to load vars for managed-node3 26264 1727204283.94932: Calling groups_plugins_inventory to load vars for 
managed-node3 26264 1727204283.94934: Calling groups_plugins_play to load vars for managed-node3 26264 1727204283.96166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26264 1727204283.97358: done with get_vars() 26264 1727204283.97391: done queuing things up, now waiting for results queue to drain 26264 1727204283.97393: results queue empty 26264 1727204283.97394: checking for any_errors_fatal 26264 1727204283.97395: done checking for any_errors_fatal 26264 1727204283.97395: checking for max_fail_percentage 26264 1727204283.97396: done checking for max_fail_percentage 26264 1727204283.97397: checking to see if all hosts have failed and the running result is not ok 26264 1727204283.97397: done checking to see if all hosts have failed 26264 1727204283.97397: getting the remaining hosts for this loop 26264 1727204283.97398: done getting the remaining hosts for this loop 26264 1727204283.97400: getting the next task for host managed-node3 26264 1727204283.97403: done getting next task for host managed-node3 26264 1727204283.97403: ^ task is: None 26264 1727204283.97405: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26264 1727204283.97405: done queuing things up, now waiting for results queue to drain 26264 1727204283.97406: results queue empty 26264 1727204283.97406: checking for any_errors_fatal 26264 1727204283.97407: done checking for any_errors_fatal 26264 1727204283.97407: checking for max_fail_percentage 26264 1727204283.97408: done checking for max_fail_percentage 26264 1727204283.97408: checking to see if all hosts have failed and the running result is not ok 26264 1727204283.97409: done checking to see if all hosts have failed 26264 1727204283.97410: getting the next task for host managed-node3 26264 1727204283.97411: done getting next task for host managed-node3 26264 1727204283.97411: ^ task is: None 26264 1727204283.97412: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed-node3 : ok=83 changed=3 unreachable=0 failed=0 skipped=73 rescued=0 ignored=1 Tuesday 24 September 2024 14:58:03 -0400 (0:00:00.937) 0:00:47.823 ***** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 2.26s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which packages are installed --- 1.75s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 1.63s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6 fedora.linux_system_roles.network : Check which services are running ---- 1.58s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.51s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Install iproute --------------------------------------------------------- 1.34s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.25s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.15s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Create veth interface lsr27 --------------------------------------------- 1.14s 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Gathering Facts --------------------------------------------------------- 1.03s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Gathering Facts --------------------------------------------------------- 1.02s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Gathering Facts --------------------------------------------------------- 1.02s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Gathering Facts --------------------------------------------------------- 1.00s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 fedora.linux_system_roles.network : Check which packages are installed --- 0.99s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 0.98s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 0.98s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Gathering Facts --------------------------------------------------------- 0.95s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Verify DNS and network connectivity ------------------------------------- 0.94s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Check if system is ostree ----------------------------------------------- 0.88s 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Gathering Facts --------------------------------------------------------- 0.88s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 26264 1727204283.97504: RUNNING CLEANUP
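The "Verify DNS and network connectivity" task recorded above runs the shell loop visible in the module result's `cmd` field (a `getent hosts` lookup followed by an HTTPS `curl` for each mirror). A standalone sketch of that check, for rerunning it outside Ansible: the helper names `dns_ok`, `http_ok`, and `check_host` are hypothetical, introduced here for readability — the task's actual script inlines both checks in one loop under `set -euo pipefail`.

```shell
#!/bin/sh
# Sketch of the play's connectivity check (hypothetical structure;
# the original inlines these steps in a single loop).

dns_ok() {
  # Resolve via NSS -- the same lookup path the managed host uses.
  getent hosts "$1" > /dev/null
}

http_ok() {
  # Fetch over HTTPS, discard the body; -f turns HTTP errors into
  # a nonzero exit, --max-time bounds a hung connection.
  curl -sf -o /dev/null --max-time 10 "https://$1"
}

check_host() {
  if ! dns_ok "$1"; then
    echo "FAILED to lookup host $1"
    return 1
  fi
  if ! http_ok "$1"; then
    echo "FAILED to contact host $1"
    return 1
  fi
  echo "OK $1"
}

echo "CHECK DNS AND CONNECTIVITY"
for host in "$@"; do
  check_host "$host" || exit 1
done
```

Invoked as `sh check_net.sh mirrors.fedoraproject.org mirrors.centos.org`, it exits nonzero on the first unresolvable or unreachable mirror, matching the task's fail-fast behavior.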